[Binary archive contents — not recoverable as text]

- var/home/core/zuul-output/
- var/home/core/zuul-output/logs/
- var/home/core/zuul-output/logs/kubelet.log.gz (gzip-compressed kubelet log)
Bgf2ïIWҟ=l-#-~kQd[C3xKkM'6DB-ME Θ _F-׵PvkmzpRvnC-[K5n&nl'j*i\J:j ?x&TtjgW%$TF ̊hXу |=uɵ=.`A(5epT)!թ t:Q_wL 쐠.}hqW#еȊ*OLo {+0)UO)Ho!ޑXH< ,0mZ}_bZc}[j-y%fX¿DEe (KA&(H5tHCȮSSEZէ፫M7FY&&R`X)P"i tuJ%'Z[Lr+D5$XG,hP˰NJ#WB%9U<: DFʙ%r8PdHN;&a a:*e޳cTv^XΊ-gG9;9OCv} !PƴA@{HH*FY[ЊkW69)G1w\uGGId]z=mے~Ax3MJOLAz)KdmJZLj䄗Q'{mMEȾA<㎠gDJײ\[[,(%>rP^<ʉ8t}C=Aٰz;|[i^R՗T%|.mDN'$y>GJ@Cy)˄Ri~.Y Yi9:_aҨ`2l 22ei3"U)(oVoȿ \*9LAeFvw3:7 u>1/2׹@:\/,s)j%>9iJ!*o+KB>|D5@?4?k8\gE|=ˑɜ:Ǔj3h Ak,7-_'<;}wPࡿOɷCx33Wܥxjtya4,0whSI.zVQ,2z3˄cq<=DDK.1uֽq2Z3kFY8/sb3)=כOCo}o3?Y7j~v0KmTt?'|pSx*Uxj8vT4+Mtキpf6 ZOFq}oWwԹӷ`ǟGK#U^? 3*S&X(7u\_?X!NqEC;5di0vj]SMG+?;l9uUKGy+3>hJ}.y΂g~~!CrPvN%Hp`F{(O|z_~2<-L>EO.8`B9q@5E4 yz>?DU*TN8S%&/~Q 92,J/FTB %&2` ܙ!LD$4BhSEi^@稌hI~nF?_eڲ|zt\:V5d[V￧n䡃by^LL\JL4VsU.Eѓ>P1@CjR<IaM21xΒ+/Wl9P'Ng?7ާy:ޭyX0(n@ Lgsz@%Jw%3seEHZWDT.j[9 S;w@|D&Q$xɹAT>!祐>E$0$F+;'^vd^ȷl><,t4O9Q57~^F—OxJl8h:;tunn.",ή( GCHּK\cI襂!!H cC"y4HD `8CWM *BC%dRye!;KuU $ڳb+m|}3Hjƚ"?`@ANД>Vɜ|@@+ǃƣ ?>+!2 R)"CCc Adm~`׆S6a$ f}d[=ʻpx1q]2L,h5T6Vm2Hz`Y,UN]6ʹHJ!tT"&HςBS!K+t"8%Ռ]4"f|ߴ1gchu7ٻ3X0<9N93] HR9gKC┡lE*$:AV R6bzQ+vwsZN%oS߄zvAe 3|,.$LP "&10L>&.V^VIXXiAa*S e= )D%L{X#4$] oG+ cxNABG,>H%dmĊEvzQUXc_iAxBCBb%#1))DXҠPFb:L:fӲ״^zMkS:a-(ŬXjZDm]UubDo})?4J#;,*zPGV9qd%4tg%iO9; ?hXbH\2 :38MTy5&R"d{W2ɪ+UY`<8u8.u>~Bq*XUL9#s'bDϵ}\]'R~LڟLoJԍM(Ļ+%,6i=LE&lNӒdۂ;wZjɒɧ5--tY>xqWj)*HwZ"dDp04,%ڝBqeÚw KNC=[W+, SF>V5e qE -b+ v^լj"A[0{Nˁd´aQ,4|8c*_NH= P Ļ,JS6P\jj;K;-`_լjrbt[Y~™DW-yԻ+4Cjٯ A3fQ"*Wĺ&l(yAۘ=8]#{8G\b)Tc.0a1"NLn faلhΟ r97˨:R")9IEd03hƝV2I (B' Ll8=VKq*?RHK~heeG `F"S+C֒Yj V#E>Qp̙A||`2Bt-x5 8LpJ# 1\Gq R^U"V};L(2g]Q  x!촥F9#6`⼰H+`)`8\nAgŽ5*j:5.,kf,n]NOg1z ?OgoMNn^LV !엿.Ѓ$BVݶKO 7(?( } D--$ͼIN@6F"rrN.Swb&pcL`oNµΫ+a0"P&Rm\:$UX]]ޢ m+04p0FЧUݚˆOb{Ynxvv:wcӠUn)h`h ERw'.S <*wMޘt~-~.y[|nzY}z10[E+AEgn0^..KR=&Q6 _1: _2ֵTl)jf]3dm3 ,N> G}&.&/>^=9L7 U6Zr] &L(Zϡ֑0%wb)>٨ ?Um *`NZ ˯37'0Go^M?~71Q 80-%Nugwgu=wozAM4W7MۡiujšvفB(x?tu]:KO^Văp%s=_nߔ@.]/{%V „(!Lfvd}|P\o=7~VGUi")@M"F Qkt2#G2 v<('}66//c_;*қ< aG (9[= Y셃/3Ebi!Eȡ$Ln9}V]zFCcsFy֖֟w=im";iE/-G笜H5%)\:qucR;aH Y@SJx-(죫V 
dUQ,jY\l^ ˍ}amEʯtKx$ L. ׉ͯëpMDj_a{ܾ=6d߯mUɜͯ`ۚj-"ζiQ0=@Umq.`UŘct|{# $g\Kr1<2>:#:x:wV}aճg8jBQk\iR(bA%nuX 2\*,z`G>Ma=@=^2Ll<]SLdj<wrc_wIf҉ꕚ@7ோ)[I~VZ.Ɵ˧7/*m&:{f˥āev&><x;v Onhg12wBؽ- 7„".Ҩ7dͦf=]3VMU?O\!nI`*g @+pIn>}d{w&ԯ`’n+[x٘H^@,ju{qck\~;}RKUi0}φl~,"U|{{_A rH# be}0z)#"b1h#2&"ƒ: [Oz7!v֨j+L].W1*̂-כ,YydcrՙRRA(Ŝ(aHz| HZ,\SAӱ pn@ 5f< zy4zY߽f^1{s i434>. 9' Uzd!‹%.jzw[3yp3y^)mFhW?܍?e٨\7EгJ pgםYR-]_/rc"+>H~m+h),!|JCGzL:WMT؆u+?*~Yݬ$~V}`X7P 6-ww4g@Y+f/g/d0/񬫷ABpƒ|f[Z]^$Ym0m~uE 0U@^&5mtZ \$-(čMK;-D,m7kwY%ub2x@B m=X^F˭tZ(yӟfzZՌ u2 jS?iՒ'QwZ,3vİ@E!iK6kd;v-eX ."e`֯|iV+[0n;J s-CN_ gN W3.h 8BDam6s.oO4'vNwɃmAJ;5e1[M(sEFe^ΫU tZDWg77m櫆Im[ nGpٴJ0^`q\l Or}^^~."AFU 9szD6{<ʈ8:pkJfisj V"AjOG'.h'MDIN aOEAj"ɹ<RgV"BJ'Mya`"ZA۠FH"I`ci9I1X5JDGɆ({PgkգٳKE`8Wzt5yRekJ\>e7| kFwc9Ml+N_m']T,jB)%6i.JJJ#Ũ,E`PTNGE`-J-2(knch`,^Nc4L`xACa~iQ ]K-Œkaf)=AF̠h$0LO["$"VD)59]#URMֳƺzWfu 5MAvQp΄tVR/C2j4IQAH7! ] ~-2;y)c#HH­H bBW.pJ[eF؇Z _'뤙 hJ#iB|)`=Ǖ$뤉zшXWU1fcmQi ~ns"|N:aN.,"+_^j$jC.nQ pBhјb+i.J;0Iem@ơ8AOE"OU>ְJvPKE3p$Z° ȧW0l9Xg5LJ*1rbќẑ܍nv46Ca&$9VgO+xd26sUD{Kׂ=Dxz`;hOkt?^)XY?~89-5_J@rSlIu91g\TsAߐo{sn`؇ʄGg21eǤ?La ^3)VU ^e[}~28=C4_y]6MG=8T 3LI4VOЂC(*@z;< 3${J*l>ea@bXZ\ AQnRCݺ\$5q hO Os(/r-c1 %.17Iztڬ 0EFv/C<]6oՒCF"=saIIQN)93Z't\#gI^H 92gzɑ_eg.@cm4нAGlKHvU7:,JIRdISLf_/ r D暠kP"D09. PS0#'S WS)qZF KQZM8W/^M_l7B35٘&wLk-o}o<]yJw%˔<݉yNs Illֹ18o9iƣ*Xep*(MD^-.ss&bI/y dRNAD`9]`e\CvEkssY"3chyۤ;."Ĺ'7pW(J`_ >|:Z4& HФhe,Q!AfҤdJEgz#Y[rAZ[0$dO,9FmvI\ym WފM1"ǐB:`sɻO;lkdzTqͥ{Rޯ|ì@" z)|9~BREx gxtPb'Ar2䠬ԨRQl((ax3e\!P8 Y#ڹ{s)Op%cPPh,op69MΗg? e@ۤ}B٥"p,<#?MxƝNGl 錐I {>avw^>M`tx-6BTcS<1ybÝU7o4){|<]#mj,C;.oHhRZoFdž׻.~)N mAx2fmHLI:s [&#@{~Ύm1OGmv \fAbj|It7rCģ*JG ޢS[r@f!Tғ|"ƜnrW o_˭OtmPנ@AzEO*31Ej,XL#g|}¿i"=_Ay\ʍ@B#q0.o, C '[2Xsr@ zDb eHчP*n"L2'^N: PYHAn2sM%ͪ_U=<)]VψL;:pMJ]£F?+^~JE!! 
Y:a6p8a$;*~Տxbor㉋@q32%SXC$7ytsvRp}OeYj<azVu4mMN)ʡmmʒV]c*żsEA#_+ڵvGTdt$tLnkR/qz' yyb|Smnh$ AHQ&j.L؞Y8s̢}؀ځe_e*ər eKE-PبגRVEh|/UĪU{E:,gL $wY^fE.RP.D6\rC>tcjZմiԴְ=(Ŝn|Z/PwEy1Iؼ^-lrdVGʹ{*ƹ۠Dр^RmCSG^M]z8Q94Vo16{D,Ch3i1hJQ2+z~)#.y,2#::7N8 Zx ̂V"Y1 Vdob/v\6!./݂=&E9j|NYK:&pڻl4iem Fw۞B="% `]୷AJ:PF 8c k8[f< USGSR2 rMd<;:#'f"E'^V}M!I t.:IULGd^"'97D”6%01+ Lpj(W`4a<.c9XT;Sf˙4tIa6Xͽ.ICxc:VuNW.|*˂kr5EZ LeK.j6.6J}=Jyk7AH.wX? ,"3PE@&gJnjh e4E*$闂D!~1HSrR$冑0v84A&t"eAH*Ņ,̥lbYPLuT$$wژE oކ-[cD,THkd^lNB[4'9p  Vs gR먏j^A˲M(n s*k`B] y5EhNѴc=O?> * ]O=]w~k6m}C.~oi?a:nuu3i{xC6EX?mԲvwnz|xs*[r<fwo?62v)R} \x5Y sќ=D m^2`o6wOF5̘bdFw; sxEjv;݆+^r~~G;vZX5ҋvdtFV>e0Io * gTmDlfх6n %O$(B(iӇjwI Bk2ifҽBZ0vu6qrHh0lzHf/jvw0#śp<1g^܎|yA&7Iv4j)zHvy4Fly_=y W b^u)ʍ>~aPkdq} *&m]>ˇb.󇆤mn]O_}l.>|(jZL.? !ݜ00/3p!ooMr-0Nf&>a}&EmYawaD!\j`y1h-DU\2ޙ"r"UJf2O6y!kkyУ%<$) ]""tyd#PJ $R/M\Bޠ.פp(Fa9´$'@EgM)U=)jЛY bpOI-~!NIi uheFsLb9$H `^XGJK49$Y JqAĿb:RL& a D!L`i&I|{kҐ9l+1UU}3gӜL[J,C e xLQl2]F-K=\IN^~,ZDb[ΐf8DoM. ʜA3d>*0̟[ A\v4q]~@!}_@tAC ;r:s甇L`}[kJ wK6]K8[/qzdV$ N3W^G,]Dk{=\:;_PRAs78W'<?V(í1 \'p}C5v\\ӬuQ7.FX~On..{} SwzV-Wc$*ᇋi>bp0|-ie#%!WS4ad%d' mO8y*>/=wtN0H;DZPH&,q0 TdѥZꝇza䎇* E蝟tɻ÷GoN'oPfN}xr 8L!ibӵYm-:cpNzIL$椹 mh t-mv-5JWl/,J8?.>\|y3o{6<INW&ZĽ "οlBI1fRqu4fIATay#f#ڀJc&o}5}}tMWȺ=^.@ -eNj Qp6()Q)FXBSvWgWK4IP  ֗g'g8j`3/7e~В5]{)ݕWO- ǔqAxKGܟ")|~-hԳ៻Bw. d.04%:\J+剥^6AeR\evB8'GKmioԫ_sVMd^Db<  I*u.8@$ 셷ᝡAvzS ᖛ<_boZk\Cל+޾َ} !]w-g6\o)M <HJxvi+M:l)rl2 ђ-^j'9UF :E{46x.zp.Yu/H kU.*- KC$Ec#dHі_kG^|M?vU̻91$0ˌ F6iE Iq 5Z ?bd/);)2h"PNh.IT"CJ4e`JS༒>Ƽ<2l$Q@qm lg0zaF1؅HQ XyrϧԼɈ&>FD %\b%nC1(T4;퐌E떵ͧGx釰WiΠ;p[@UìI̓ǐ}ot1vU Uڷ~ ^y&G >`4\a6/_QL.ưg_zuq;\K~5\T e**jA*o٫ړc /TAx!/WFJBgM;֡|xA^ Iwpf_/ۯX¢~ίUٜEu~gz1Nj{_~*:/xP28K\渂^o9CO3)75GzݛJi|ʆRivHuƔBYz0JHa$hN>,Ȉ@^ )k hFIRLE^k 1na&Q)ML NjS2 @M$hm-#y`ػPG_.w ڧp{fLvRs۪ tӳҡzυ wp.gS$,0 g>m*iHqwVTY.9+Eg pTkdPܡP@\qmsY_Gij fZ{MGH-D5`STFv޲ܒe)ـS}pWbmи~YNIɔ`YFP$dZh<.|oP bKikq&p1FCFNk듪Ϗrv'eh2 n)ŭj[[)= ٕ11q L0Rd=PFp;J% ,{zt-8}^dYCf哺rf[nP~ߛL:2< 0}3β^J%Q3uBlOMLXVm7د_%sx#@)ŀpE! $# A;׆rel/c_YC QZ @)8Iiʣ,eA#@RVi*B:Ԇ5mR^m@(W? 
) *uU"ԻuQm^A-l$N? <)cK dJI,S4hKEC4o)\bBڗ3-?mziv=gg<ɝr:Ԝu848d$ 5'4 gOqi(.Rtymd <ËyO<i\/~uR9\9 @9‰d.Y+$S˵^)ԛIm*0%OAhtPX!5:g- YОDN0XyPg3ܩrb <Br 1cX;…(FURT [ܻ['ƌTH!!J (2(w(G1ҦF^(II A ?BܙAxC(E@Ifu`j lewcj:9wuPߔK@ 8O֤RA lL #ɏPYpeG#״_q"d}MrRѡ " 뜓ΩVS:n2_v&STm97mT.k*ዣ.A:m`5p`7^,IA0f)UNjy"}tݏpYJT-zi9-Aw:8N nINxP+$9 Q[c]N]}C5v\\ӬuQ7.FX~On..{} SwzV-Wc$*ᇋć8ۇkAHVJJB4h"J,O#Qp(U|4_zrE7M`*#wSwjS_Q%k:; 0LXa4(FjK;[ PU8'ַ;?wo߿?ޜO޼;:̜\qBEkZ u?+̆p]f9O2SKN;*?-h%r{b1(WOY:6:^>/1'BȻ^o8ɵϮn떾Y_O5&J=O \llŸ=Rcšk5%FTͶCZgBZZ{b5q !ѪP% jgW%Y(ךY̳]tn:{8w:|  n_ZvN<)nruJN-]Ovݻ.cNFF1'ս@hs׹׏z>e%u=ܺjZo?͖ 7nlb [Nyg-f=/o7Ǽ rI<'oxRe^VF!nƚOtئZnsʬ5}Z/:\zX2Ѳ@W @ÐW >fb'^z./7燄Sq:m e~TP)moh~Q NQ)T墦LoQrVK̜6G/o&+> A6uQhk݇Ieky}ٴ|7㻹WZ7L|\~l՟~_{MOTg ~to{ӚJq0=~AÏ5lKzv5>l{_M7 ]R~+XR  7?Hy%2uX|Žko,Nۻ_oU/Vb!`i8 7?:WMV|zķSqk#?Om"Ov>ٿlRD>q)wZ`tY'7ܴRnĥTL$А`F[8(JIaJQ1xΒj=+!7ydU@3yGn!{E [l?ZOQvO`àkbF\G.kI f'˺5ruyfB175,Ub[67n;h݉,ۖqY'_LB:-&-4~a|-ɿaC3::A= 3B.>2 c{Klr¡G7{zϜ4<2I`i )$wYg&;5l2 dcﴋgNYYg"J}틾&37Cg9P^ u'B7 NJ*RpJs.d<KjK%RJÌY1x4sfrv L 6x-{x]p,u^ ¦VZoCAg$S΢,PylF,O$*x+WFX_VLt'[tޢlP*ū+ G% hFr! & Jp8eLQ!H1)MٯR+t}6|JNy4.jg1)PuX8EF-]A9af֡m!&M~=帟9̋)3}r Z͛ML(\$%81t #H i-:ʭS*9UԺb!Tll9rcwTF)-G2,0RȕPFI0;: DFʙ%r8PdHqvL*0tT.=]tJ rV}ai60j!iIi5} !PƴA\ڀ7(j22ܛH@̭jNE>3s>(Z@Lb\s\bAAxPk.8FACvvRg'Tjk!1 x$(Jp t$a"rė9 :rHI8(vޕWRq=XDlӞDDҸOF4pEyI*}JNx L*I[9Җ+h\Q[+|PX`Zxx8A+:+4òC$YP+J|4汝xent}Cz{>46&0WP9d)D Bx`] 6װ=}T;!QNwRW)Hɍ<es(&AEe (KA&(HAx.MQh˅sa}IB8$ȘT&H@xFhN]@ϵf4~iXGHm>GϵK]o~OUkr*a J(7Caw&I ?qp|Ւj}T 0u_MORwO4N4nGUs+-@ Lu>ʫS-,"T'r™*1TW767Te]g-cSAgS~ :^BQ %(at\`RxFLQS y6Z9i]Rd؈I:`nC Q?]~o2 1:l]iZgW떾Y_O5q?J㞗K%5|17T믯~O:A!l=>7?e#c }pK/VYs_;|䖧t? 
^PoiQ.|Z= H] Z^9l%)a F_ܱ/qȎ]rBsE3%6*l:4&.K\,q;= &~)yFi7NR1bi_7_Ē p,2Cj6&=!2EH.2 gx;AoAvOFSkP.69]7 ~j,Es)|7go8G< 󃻻A9"&hs#WzfG2g}W-cG勫5zoox?"~ku,HC J*@Cr^ДB"~}‘Kg[={9YΑW`U+h m`0İj2ӭL]eBu-m!._ؼS4>p2{(Wnp< <ӝEq֎t?ߡI@ZXh.P፧:c\k\ "8O` y*ShUd)s6cҧ -|Մ;rMNmQ69睜Yk6/Ó9^,sXȔ#ESFHL.U/%=Se#3uo0ҝ2-Iyyp5rBS([#@TisT\T=Ez k]hQ誧<} v˼/rH,kpDgiuߏ.w92Rb{xNQ浒 S<_ѧR$v] )K$)P4'c,&W#y"ӽAo] A1"RKrD~a`:L' D菣#d|Is_!Z橏7q*N)}bm:*2U[ۏvj)O'7JUZ6bw4Ԫ ESޓѰ$Jjh?ETR]6k7P΋[߷V)%ˤ!qDEs2q,؈I`U Jn) RTy-5(mrґOV{Ro,Ǿ>zX_O2k)ƾ pgY.p'!(fJhmc"^6d-}AK"_wZP@)4= "ED] _XģMm ΆXc/cFSB4PNM򠀥 $eTW}[E纼m`aCZiF#b$!ax/*ѫ: /Aܻ`  5j^YByWfEb#-wW>a4|O`W% _Ҳb|B'RW2Z^N^fFA--N_HB~*nPؿsZó{G>WPq" S|r86 hҒ'L"ŷw̴^zT3$, >iMjdj5!J!f x!(˳ lB)$]#&ue4M?S !o_<,yΣ\A3`UTJI9I}Mltk 9C҂('45%nБH)XfҢ+민Üϻ$O${KxvZnhP+=usǥO{7T.s ~9}Kǫyˆ b4='~ Rwv>][nkIT ߯E/n?~BFRq$Gh54,!|̫pп-x1f7Y39`qTF֏:dӨMsE|qS3߇>w}50Yl#Z qlqSȚN%igq{sdP{/<)e=}ˏqTLJGM$p3 ܉e ߷z &;͆Z6g]/q 60Yʙʭo?Oxt>ZszsozkTi9`')͉f%-5LNoTh'j(MZUM?E2<mɼ_YxJ`\H Ƥ1:A  #+dt HՕnPg'W}IO<0 K]=4,F({M,`"B<^гg\N W䫳8)fN2sW546clB@uYGw՜ZtM86t4Csګ Fv;j^r䆍AZeeCnXU?T2"QT_㠘` F2@q+)߽^|| ,L֔{]vW]->OMgXdMۆraKFԻʶҢr]YZW^z2X](v%{;b<*P!>zL>rr9λ;d1/$4W_P`ZhVb7oJBrO7&(ҕEV'4J@fVDtx8r&4{ga<1¿.³E|Wt 1 _4x3EL9Hs=Sa/g*t_Ew1ȥ˷|;$TI]0tX4!p|u5躢{ZԜs"7JD8TRvGkurNʔgSn5ъ=39* 騟6og|k5Z ou{9D%m{1mx~C7Yw㺬Ԛf&}ΚϚn9[lSli0M +ϭJ|Dj)iqkY.rٝ_/!.{> y(.bt1:0ǃ@$SBe80(3Y`jDzMĴV:E3AGan0dnG12Iu,Za "s 5)AY=!.;JK:цuY_ꕛ;;^xqʋ݉ぷF_.;2 NdHH1(]x'ՖJjɳclԣ g&gЛߧʭS+/G8ZA5Z4^V[Bp7%Dj)K1/0o񵊉&('>rq1X:IF!©̝gyNM)$ ϪyRYO 2F56 VH Hi# i@G:ZUgJsBsRLsRi,>:I1Us;Nwj*b*$v1!N$&ءhJJXtBS(ܺ1F,O$2xK̗FX_VL$|G1oPՅ{wjҴ@[iZ}ga) N܋I9E YvyCrX8EFm U[l32tV*~饺vSE5 QÍ'L\QV*(61!HFIJq)P"iu[Tr"u֐ճ!XO,P˰NJ#WB%9U<: DFZř%r8PdHQO;&a a:*e޳cTZnb og%ڨڭF{RyHh1xP P0JQ@V3&#CD3w[w _^ ׂz 9uFrN_jw"2q  eFui4~'u~ҭr{b T\ \ތEӑt)Ѿ XysFJ:ъدHxn6-mԴ6+=1Q4.)ǓQ( \hx$>Fd%'&mގvu/z/kE >h|Po x8A/:+4Ӳ h 6gA)(Ӏvhscl:}<et׸ z5Ί~sftlRVMw>12$r , | [J,G096:)dm*:Do~ɛYd$%Y(r9Jo>Lcݺ,KO^] w ՑջY2ʆi B?_Z0K60b@/J/vT8a6kl $Lq)4x>m`Uѱ$T$XQ7Vc$;{F$E9$Qf}h{f3sLw\ٳXhF$fYff]IYc"}i  
`ʥGTT+gХld1nq0M^gbL.3<;C$MFGoqݝ(U-}*h*ZV}b>-^4`b"Ò(,jU͡{tB 7>U&(\oCS0$F䋀Nm kA栢Ҽ&]!$[,,͢mUPBi>:/ lvS*"PԔ@ȥ+Jr•G>8,)LYŪ|  )hT*&naK-da|gӅZbe`yh jDfL7gҞu1^0> PHk)ʨ@EOzѩAyӚyCҫ2"[f'meH0*g^st99j ZVQm.b.bXu=OiI0:WF|_'~zytN5(fJ[t`?ݵaAi!RXZThw-D*"_2}$CN_<=v`R1IƆZlnVR(77HctxS-7zWOPr jK&琔+$л Y[ir9'Hlv3e8>zcKg=s~h]aG'Om%PT"aFAAMSAHې|DItj[ r$%S9..@-fi)\]qʿ43}_yI1{rǠV+xW͆w2[$nwv={^96춣} gd$$EFM>f`rLm`Qե-ۘnlӫ>,Oz!^u3|UE}<$/U^ݏ6j3Ntݼ( u(-T|D-҃QODOFk߅ǷS +լ+{9k9QjG'SQ (QzASK+R,k{*7](wjv1AxּpAf1cSN\RQ4{M)u%*;6JDQ||6z3d"L7YFf:t݌Ƴ*rUA4 O}6VPV__M_ɝqr:>j2'7t]i6rD햨>p9~+rF\b"梙(UwGU!V\Tל3p6s.CTmz[_PK~~r+mr7I} 0^^|Wv\( 0՜a(tJuB^x[jq{r؅AqFK>c]H6!XtJ`3#dL"YUK-KV&B>b.i[BžziјG)u(d£J%cB KΟ”JM7u:8 utAc-NzJ)mv/ykpLjƢ8s.HYY}:ɞIQ ּlhj+NKykJ-ZXkn-W췙=i,.xr{4: iR , ]Y1Ov$:oĔ:p@ϡg o ]̪Nl4)+ IwIXQ(ƃѮt1!3Ń6D9I8Tɇl^D Md":ԬE[a4UQd p! ~VP(R[ɦ _ފMVxtDyTƤ * DR)58~k#1`!IؤyY 8[:>WAzJ9JY"$A/KZC1l!$#*$eC505N\6P +| ͒iucH0&ɲ-I( X*t;E!Kfl$Ât5Cfr7JEciD$1cr;l JY>#Zl .HK=)Րj~־MB۴X ^8a5\ӧ88cΦ]4l.( 0M$ M` 9-U_oI "3E֞(c fPƦBY6W2[aJ1(ezw#9PWsTf:CuuUWlѕ C$}t"uLNpb,f)`>Hfϫl[wv-ZcI;(Qn,O8?܎ qǒQuP*}npFzvo]L49Ȝw`eg$o"61DՈ?M"-Xp,80 N%K+U#e b*% fn-;):$ μ͂k}L3oSט+t?Kܓ-=Yj:tR`tAzr<FM^~t֚bQ5zhH fѫA}wi<.'qMt!FH@VdVȢJF9%E%֋s!DL9sQ&*j TZИ@90!cz-Ry( IblmaE52{bCjuwH %={>zO 4>ӗXS@9YDbL E8^k ْtpJ{iJ%/>@8oK$ &`R&YJķ{f-gx׏sw{n./fe*m{mE~q鱄(n`VxYO7=$HE۔38*"ˬD-e[g2LJZYSI.JV% Iu2e{O O 2lԷczX:ߌɬ˽|im7XP-3,Oz<=e@=a?|bYBW{&(>(vg41ǂ^]*J7*ڵ|7 Ȏm*F +#b2!T!ӑgMIKfEf\OH[ܜ[ɬ-gQ DXdULB&u,zL`ȤdA&Qe7'sʭ ꢊ!]VFMHoAM5S[V jczo0&/igUUg6'q7kc ^ >xbHVפlTǔPRH-ɍ89sVkIL*N&SfW'47:! A1h\ep|vXhb|){)ϙ7aDf+f/O#_};^<HAc-;I[[.@X9Fgc63-V˫?\u=?/ 2\_28;{$Gu[7F/ D۝itvz6w&LXhx=/.kj|ȶ=+dg:ZHaq *8'rp=-Xa&Q8h#Iu+6sp럎?l57*F w9fW r-1sE\=73QxkLng6V"^rOA/Lq&*~IDX-w+3vڀ5i'!;7MoaC= )M"F Qkt2p#G2 v<(' W:}\_*(1ì(9[= YƗŢVQ1"otWTĆTvOմ`ö@zӯKOS3@3k4Lo^.LJJ-Ѣ2%>c2 5tuf92MsC D[u?PVrg1c`*2f)W_ogסvët EڣgaPLd Er@0ɕgoW_AGho]a9=l]vw$i6c0PR~-8k{[K`ho5z"O13HKl0$J? 
`J1$^#f aϹ`1X9]c13iH)EԒ)E 0s2bmGPOr*_A7&k}-4GOj̕_%;mԁĭXZaYZ3RJe[\v2|ƟSyJ`$E V| ^* u 9"F눤X9:ǐ!D!e!x0k5f,`OQFMFS.1:#gTpCVp=Ei*)$CzO?ކrJ+?`+?G{D-iFbռ}W&W&y_yWǝWQ}Аj)`ˡ7|Q.*/sxJΔ>2TEI)/*}9/Siߥ79;*3PGuYe]/l>%nK}x] ^V' M']wo3Zm95P0D0tdWl\n?tTyG/1q+ k> r'AFU %sĕW+w'm5jUĞ>G6)&95Ty!,k̓mOJ+P$I ( e5BIK˙eFH t}+rV/'OGsE'bS-? =[ʖ(>(>:DR*(Fe)tr%z/p%[L2ZdP*Kng[W3J*{$VXb4L`xAC4(n.PɥbIY iOP S93(* %B# 0iKD$2 RjDs$F* M;YSFM{R88b1!JR/{CP0&rF.#ۄ<#v"|y)(5?UhرZF$b,:(bBW`.pJ[eF]m]-78)I-$:@ HJ#i) Jou2Dq_Wlk$]ȁ#FA_{Ѵ~ gHȗJen~Q.dw:׽IGm9KMӒt/I*ٛӓ`|X%OY$*!-)d: +j.4"M*FtڻT,GXz/e0.B^ R3/ 2dkkC$@ L` i0D9DŽRn\Z(rn #"nw9lqBYqX+B"IPg@C74K8J }Zj^rn\t>74 Z[Uk:ܲspdg0Bgt  4Y"?dZ4YR|5&w\бa1sq1hXl㽕j^+?)~Y`^bRR\F(PJxR7@g`ѳQӉ\BN2j:QyVӯQMK빚5*vڋx;6LTZj l9`bG$2è*b|=p7wy u<7/uŇk`exd/ e7 S Kk0;ɔ"A&kgW8^_Ȏm:.tsuGrn-gMY^;vRg2_^wLS현|,nqHywǹa;.G/o3wLR6Mjrș53暬q&Mf`2}sfݴN҂NDtdkS龎 g<P^5;ގN:X}rr5[HsHQ;$ S&D;bdEG" ӣD;SGXֵMg9̢*EdDRʠ I]&/c"F4,G6w=f?h?ylFƃ `SAPœQqLaό6RC HP=V*lAJlB p$LViSRk Fzq)!"gс8.0-+0&?RaR\`\JcE:m-VQ /_!s@ _57b GSszGDJNcRɜRafEь;$x,`@H`"8#ZbHq G|=-K) u+H,s"-06Qdκ@@20$*vR K P1 ء_MB'YZƝUīK?Il)"=#=?gO[T@.L;8JH#|*Û<E,.Q~ L\ث^7v}.nikPV)D0Ktp*3 ۿ>O@^.kJ`pUC(X=[.*OF0o=w@$ I8ch| bNẸ4^OM{:::\\.~UbӠU݁[.r1| 5S5Q 0:Qil>7]Է%FX-Z_P]x{59k8^t!F3b.̷ {gR=ۅEb0|TecfBm==]uCZa58,2'ZJ? YLV|8_O'zCI"Q.B[cW9LQ5J{ϺƘ{!d 9Y9CZ5[h9Jg;!H_޶ՑUu~8w?>ŷ*$M|'J5%UEU bh׊vcl$JJk*M ge,9z, bUS`P) d:# ճlZQ&>q $1Xk]ԒHLس1M&?XφsK=;CcKvȱ+E :7FB1DH%'W2:ݛLU)/[YvS/Oe -׿O-S*5vca @?KԜO^_q0^DI %' vZ̓ft`k9SbhzjB`:0U^%* )3[ 5b8Vʨ5DCM|mLbHbA$n}YBLC w.%am<"*[ ŹE ["wU>|봺[;FY\ muXyS{E*6)p}U-Ocz|v(1lU;` tjULGΫ?xQr뀄96nOqBlR^-^Hײ.~vb޲J26y+S:ylv)>H<{54(f(lj\`w'l2[&Tk,e/68Q xև`!i! 
Db8S,۞XcEƷѵis3U =]=z,8z|Cs= Szo 1C@LƁ1|):}9OitG)LH{'!FUBe#N(.h'`auC(\`!Ql\mܣrTk,[_mɁ-c]8ʏ?4v;sヌ|߂վk,|'\Afbo@ӵl*|wޞ*)Zfuy[Y̕(vkۙ87~8YiY қ^Ϳ2Vc\2:@%j+&vу;Mx@G5ɑ.V(|F`GSjQ)rb+ MB,;j&!T.&ahiu<m\ ܸ}}DuO׵'~h#rѾ;[!Ex羆(3.>huqmC#6^D^F^^Ŧ򢇏/zu0UzE_: ۧ{9g۫_9~w|xK[Wܯjh3y/z=;ki?_a1ϗ].GVӶy0r~f{b1"H5XXtuyſ s6 _ґ3&2%Lkjbr勍c^,՚E!`suL^%i!&]![Qq:fv>Ktl١)too'''aIŶZ3庄UhRZgCpiɿnK*un /O/W諗whC$iDpض9u98bܖ~^h:}Z6KDġ;=SyX.K~|96~1N߽4/)O:@;/JtE?n7dR9Ez|*EcƁ+XtZ6_l>jI[p,`RH[MC^]0ċ^!T^qSDȉt&'&};J|ujoTS6 bjAI[η*E24Ɋ38rA@or>%kK1 Ym pUkVRac9aAz"r?Ǖ=ky37|#jV5M'e Tow6n3 ~ԗR#98j  Rʏ OMWbhD@"|(H exE9`F"B* fbh-()nNJ} $cUCvә7u`37 P՛ }}G~DT\Ϗ%Kҡ_u)an6B03i^ l׬x_6[zChj8mqjfvO9Mk4Pӽ`6,+*oL@ZMDFbo.qcΜJeQ"m(uWD𩀋Jm*Yit$u.T rnaLhst DՇӷ_VB<`ofW)fFjC2ӯÕ Œx 9R/_|52sˎ,Jjx.]-AuzNiyw(BۇL mPtM~Zm7MI'xr^[W| 7;wM,j1Vkc6TWڈ]89JE>[T HĹ2D)#}畩fq?\>O R4xysuٷ:\zLۇ/wVnH`m TNgVc,P9 VsAIla-`̅SBN J3TtRAYU BB5EYĜGhO.Yf{B:FJoa*o]rI (hRZ>G WMc[賦͚6k}jڣְ{P-} u#z.Î+Y,Hn*INkﲲXyzrt$r}Z&율`/9I BjSʾSѯl+Ϩ ;¹k m8a+ZNՈRHF2$cJ#Aӄ{mAb-ʒs#ciT3XSLד\1GȖ~QER2N6 0BQHY?fpr|02B$o.P>-j o E-T^e]t[&%,36^34zϱ kEEszsSN}Eň\#VT+Ĺed֔єkJP,x( 62V(H3({Hi^`)_M ݆Pq"&XۊQqrwOMM/jKI!Tf tXRY c:v\KX$h3[GAbIIڌ=,*lQf; ;ϸ=m]{e]܆[gܓ;0v,)}2V;Ye2KϧR4 |ɱsnns7m 3!P{z^ʫ+N_NOZ/'' qQ? :#1~C^QBآ'Ql\mȆ,GFP7Ez\m"',Bft׺#K5FBr3l#+ԒZkJ>z87&vShuj7u-+ͭ!e {} 4רEZ.K> B5segKJQz{"VY-˛@!fcPmM. 
ޮ3M[c9y9'o]J9%tf;憆;{^nd~SYw)PGP'wL-}UX9j]g,H>e눛ԲjS/@\m(T Ͳf 2$̐3ksnK|@>p㹘6y3zRl[QCΤMa@:¹/%13N.<9}s<9}ޝWyLhNB"{&sL,`IHheJӊ`)VEew<0$2wsQF}T )\:fENǹ$j36F|6j)ᅍ팗B0/-/+/ST6-2kcL>9ݏ,{OT|u>8v#24#2^(  ¨:%ƌmBrHm9[Nk{҉@X% 7#Cp$)@XP" i Bzed,kaOܗ-Z#YSIN۬4HZi.@I%5!iTockZHkO4y qYV$;>jDmhރaWhS6ʬeC 1 = e*CJ6h@ ȋݑZNu*cfɮLJ(K~Ɩy0B`b6CH 絑^_6}7̽GwKY鹹Ò[ 'A$kse뽐ހBڅT,Xs/|6Ahr@mT2XKē# $ )Fq~.S?M4<4Xw9 nm"i,8a,+ E"XeEjK?o<_0zA[!LʈU2L"9R@,lK#-K=[M?oTG*Ӄ.0K%VIɵ ĜxO _iB>|>p1ŎV{ο΍Hݢ~lZNӢ#/F0%Yة 9Ԑ)֘Ԇ߷$')B*s#6`N֝IbO~?v|5}bY1[A!\~u/&k vKNN0mRغ\;#]F9[;f@SP'ZEBcv C!֎*Qg\7꺹Kj>ꨜ0F_P w 8g3A'6thO6۟| ݛ+ڎ$vNjc_?…r_~8@Bl _M?^Ef wz &ƛ w&g=]ZMNy͸OC2fkc"ϷǿtGS8-w*GA$;q"`~=&ufW3k*͸Yb%:,Ĉ f4#*,'qω"~hMoj;'P`ℍD 60#`ZlIkuvgn9:@ 2 ކϫj~# S(5YK {#Ghk`&Thn7s3Jۛc؞xikvGŤUm҆%cF2LG`Rs‘9zs+://}Zz5&xF@WLHdӞGO{J6}50ӞGǕ> > i/n>nKNDeWߣMWmf Uu1WY2pMJJmer#Ϳx2Wy6?Sȥ=~`qЫyU,Kb-R|3ᢨ~JΘ1jymZ wV|믱 2m `|Z)/]Ue5ޅXfO,_"@ (U}Q5bW- Q)H̘2pIeڒQ CMd*0\fI'Xm^ׯN°{3=-kΧv}"W6=b)̟5`xY7G1 1NE}4d ب5IPϵS Ch7Ey z|yw#nIսТOSTlTn|wլ"٨ֿ[d@f #˺E\X-(`L{`Zw^M"J,ڴ EANdCҶ W0=m `rcr\-{YޛcuK+:3MefDj$LƇLgfؽBl̜63I Yd)yyg&OLQ5)@(95%[:gSLUؓz'fo'T; { EvH2ҋl]S  l2da2&iq @6ticm0TŘP@]IJ$eާbtO9.]Ww&q{憇:<]6u9'g["Vy3ѯG(> w@IX`+'j{(I= K I`{P ,*}(Jkľ*ӽz t ؜6RvwkZ_3!}.լM},p`>S{L#*>U)}g'Tfy@? 
c:)a- iѯmfoP&8լ6Ll:cte%68*!W%p)W(d>X@!#QuS-u9ck -.+iã $뗼v n`.|Gg;Ό:onFfNE(#sB pyd)$C">y=0F#mQID笉:xDb.&gdKGĨ=JblҝPP,Oޠ,PҤ$ dJ, m0JPcؙ8۴-oŅ>(/hyn?E>_n6§~MhVz0;j ǫ;«{v[bo8Q )N|Cl74u- N_]Q /mt= }q' *+x!x!'A%aEXkƵ˖ R%QBF" 6].D2e)J&^x8`ʌ )+1Lqg^Nynh.ótuqoLjd>(Zd-H:MP0g:ږZX ]6&m2I$ [}Hs!I,9 ]u%v 쏥t9@?KZl!&#)*$ecfh 5N\|5 ҅nB|gRc6@1Iے^ ʻR;E!Kv56Bg9 NAl:(b dIFDI  Hb- > os`a}8+Й8[rHHZC_xv3ڸkI z}+[/89h7 u/ә,!Vq I6}3a)z:CgbJ|0wv1ݴAk #sAW (u(b1FmB='׎Ep(hEA K'ظږɖ#<lFU-:K ƨB=G)kj !W; HmGu%hp2ϞMau{rs~V)qۭ95qx>E1n 4*]*\l )Q,zB{ĞU {Lў2{ʢFU2Y}Xb)Gp"DS\@E($Z MA"U AzMg1G,"tk;g˚I\L^}FW% !~a/s+w͵ίYxMlz=ͺГ.ThJJg Ƙ@Fp|%_ȖXQt[)G 6HX`XD %'btN)F6D;a g}v\1oT:wۢ6m(|O\;3Eil܀rf}@1u26_uOiIWU<Xȼo)#)_9^YWhl|pc;xG75Eyᣗ(G:đF$U9W.Yba1h]T{Gqd;2tv+K~zO;4:|6:bB˅AFWQIp(E[jr dtq\b|\(-E L7~ U 8&GO]v%&J'vN:bq⹩yWWѓ(]m([_s7@;\]6,Z}/8N֓G.=I57.U{Ee%SfPJF@9mS faV1hb 'Px)gR'|xXJ%FQ?Yslz*%DU%:VZR6BvR]{ق^_ؙf<ʎ}}•Df5dz~>Qwz5MŠ~A_c?kE^a;ETZU- Bv*dbiCQ-Hw]tؕ8{8r_vgT:ڪ=US78xe&ܤɱ$U|},ǻQ}l>Toc"R!2eF1DsLȒNOԲIHo6azK{IK;h {X@5ybgڅy5+}n l8V\(4 UZ4 Uf,8ᵵ/B:/;g~a$@e}6^8Q o;Aٔ6z(検%%ʵ^2D `RR$dC Maښ80)ٵn%')*1$G'@xi<*#j-OD9+bHleB;QuqE)oBT %#b3!VIjDkڱlJP2s!ukHOJvm9Rk:I+RNVr!S}B^ALJ 2bOuYELkVg ggr; y?oM7T.wU7xWE6GG,z2H#`ܵG0ϳ@Z@5Gs PBI!PGYNG8y^MgϏhŹދU_$Eø$ B,jfU|)2Ly] n<"??.gׁx= st>:\Zh0yY=jp4t'g6 w:t6 ۬+˯lF?Qx4Zgޕ6r$Bb]̌ˀ3bmcvFjdRv/oduȔXѺdVdU_d1nX mb뮺nZ^uRDBFʅ%ϟǗL}5IqqxvbܙaRE;ч~???}/ߓ݁i dޮ#$^>Υ|/"&{^4҆\?|]+\Vĭ3; K _}~NfAu9⋩\MGlfB`RTi͊* Vk9,z|Оweᱝݞrv`F&x\ ?%G$%qfk Rgn9:@JEIz݆kZ'C >fMNcR"cm^)Qb| d }::4J*R51iLZ*Ү͹9y]@lyWᖇ >Y8t#0YW4mD+j3Wmf Uu17Y2a Dׇۧ["pYKϹ%>^GQuC'k;e,Cn?ܟ?upl_V4(gi4Ҹb ?E ojAmCϞ/h )kCq P/gwƴ;5O66{nY17ڈ5=:2K~5RW^^hn#}E `p2d^ upV X+Q?ZcJ{HuHe(W, r8\B]&*q%~=*>M<岂o'MnE/Q40QK1]pL`LN \qW`H &{T> 0U1;ȁyTOнz׎{޺[[WpȍM,&ͅ+~ˬgfȠKz􅆆F74]Fonh1ONH# n?4ZI{y[O])# ûƂ eM"/I@2:=n"Nٻ (V. 
y|r~~0)z} R&(3 \-P CMd*0\fZS[t/؇5p9.mgRWdδ-_;~;XjO~Z@-0+^n:)7\cbD2<4 ب5IBI'HJ<מKL1  mϜ)OӄpX]qHZѢof/aQOk3,[Mz\{$^0'<^BɁgu_L|0amW SLCD=ʤֆC0퉬dmk"Aw71뒖kbnxYXY+',Í~|bc`+dž=J pq94m|4^[\A[Ÿm3#Q)#yA@99†yAdHj*ѓ)]`Bwk5رjcv]^6u^Wu:P^ 0GT6MՎB.T j-]Q vXNgrbYSlfܺo`2˦.4"ek4g=e̼efgsd\LY`j/=z!٠x,=~٥`3%8rRNJQ)ϴʤIg#렌&I)DJIde EB ) vÓZ<ڥv=J+/R3q+*[xⅴ]V6f:̩lRQCv.Im R\ r uz;"D[iD^AO"|XdW‡yq;ɔjZb/l%iw5hBI^<-+LwA:qȮ!+K|WbX;__@4eV h%HcJrV=$&')RvN7g>ӿW;aAp^4!Р͹!+}٧ ٩ s1].>{ۉ61E\z.8*sDp<ꪐ+B]WWDbثW/ G Ww@_79+7?3:vnMHrBB& *~` ~7t64$柑 ! 7ӻ[(t!XQӅ\}4u]M*kTӚOy5z}0q_[ϫQ|K7j1pDx,)hyZ#V_YcGmN:}„~š;'a,1&8%c%Z#~JDŇrr`V:1 EИo5?Z0|f/T&*2aꗿx86YH ) i\)o)' ^5tV5ܔ"V. |B<߽w/uPX\9=(|=8#RWDhU!W[EuuU\9QWRH44ѽۃ”?^ -,0PX|DcY [Dt{`%7GzIMT ad11Õ>1Gywl[ [ P˛0uq]ß~4.Ýug1iBaboH| 3ZFq0͛{=x |L_!N (jUo 5>0޿n[^)#!XAA߼̼ I@2f+yl􁸃%rR0C%7Qtz%>43 n;a}>d [={\]/ >WVe_-\wA]g*B-Id ԫI2!"i;ۈ1LL /C4hW5r|tu7.9.IӊoW籵> J ]hc 4}&Sv0iɭގ^ghy&ųZuՁz]uzGoLwSRJ͚3kmҍ<]']ۂ֐Yon_sÒWf&uomMdRjrLTx('|Ri09h<(΄ߒGx ;rR1^TA4&1Vg@^\K]p˽&mAYU/J,҉㎥RS)+[n,uo ٚqUmb+fm,LHO |`r.K5fe^ᝣ i5W<_?W ~}ѹbWݝmȿQSݿiU&dzǬ~Np׏\Õ12&b `c21{Wi*H9wrݩ-Oi%=D"'.90 ZY^Fb΃q\X P(섍ܘ`3`'T^HC2r蔕B`eX\/w(Υ@qV9敷7o2q8Km۸s#8ߩP|ggk@Ms]gbzz jXI=W[A+r|~.'k^ws i$HAirc Áʎ+*}q'B6ٙ(JS4Ұ0LbɗhK K}r.[6:,0 Zxi z5w\F"3,ȤbjB>js,byXV@^R#)1 ե-|6N))6d)NEiUeC2vB9k*{ٲm Av-OP'_hJ?Hy:q Ʉ޵6r#"i$&2Y ^cadɑd8OuddıYd+X%Re1DVR Sj񹇍@&Q$ŹET>4jKT! `(-#@sJ -@{Ђo`b"AdTHFy #gBbئ5W\oN W͗PaT},Ezؾ9-5)GGqfQgWT$Z:1;*W B] z*H'DD@3ĄPMdN<ĸ_B$|*:JODc[&J㬐)8jLxgXJ:It6y> Guw(T)W:p9%^x8SStٳ]ۮ9h)jr_z,J>tJPEKJ\}gj:tH%%m ט[ %M 6"vkSn? ..sE瀲jMgHI~!v=&K=b zAyo@Y"1Z(OJgǛ֮priQ%y$eojG~wZ(E3j&QKX,-c! _ "2KXGьKn&&Q-EfLM( ]u]O'J lVw,Ć{DrmR%SXǕ Ok/UCXrlCOmfݐZRT+oqANB|*B&!F΂ >qe_(S{)cb(z g:&Yؤ$RBJf,F/֢bX]H B½9Eyn|Ґy>|ut3 ӓҠrk)XcG*G;PJNPi|&,IJsΉf0=.CP1q&WHJDlH'&&,RD6cKYcA?.9Ǣqc6+Y[w\T&x#@!EuMfB: ?p>"y$S ڕ)Ff}X;r Y1F,ՈFFl5b,@"2@iA/eFE5B;MEK7=VZI;zI{4 VQw,QqǸxDm^'>2Y T“X LePV`h?P}qyű.\"%:~az* 9gT%< P`b2r'4@):6WC}m٧I52O^BWh|)ȹʉ@œ%uN$szϤ7\ BE#sYsB]MlW10i-y B\i  9#3Pp(ߨ\ *ዣ.Ah:m~r-lNnu놳iF߳Ag)Ukρ;WkKwn:V"u!EMGp)* ZᐳLMep6?NF幡Թ5iΤyb?뭺&O>O?x ! 
sb.ܷn?'kbm6:A竬#wlIGh5#(ifY>Fj+W1[Aw3[bEwfrQY?lM6UVQGf2> CjAdqJOhg09Ow:5T!OkY?y sdǿ::~??~~p(3gW` I Lo; vY;C$&;͆Fl1%g=]Z&JNa;6fkg& ?u~}?MJ ͢ꓟ|രӐ&w+H6$@wԙ]jm4fIX>僘\4*!uM{?擸D6Gβ3u$@͉aܧ`ZDa6':9ZJ1N* AQ# uzaZ|<#_́3sǤiLZJX K9$iT_H$#Jq`|uy,qxqǬpckr2.&=xWK_2JQb8>qa48Wa\}f^ m8Ϊʺ*Fu 1MKMWkYYHo=&>\}CQy=I{V[r}ʘ\ϡȲWXx|Fie-Bt:\kAozwkdhl}^.F~>`|<)p_qsZkΒU}ma&԰ ֘ "j<앨l}Jp+1I@7t U 8Pù[X*`N3GX{f8@f߆jRG҆/Q,rGZs"e@$PE#ʟbnH0{RiNM޲8 Iyxv Qnzj\JkI OlKs_B8u=ܺ^=.gy ~jb [βmΧ%yz~=]z6B#W@[iDT-)vMٯht !*s>% pփ˘f ,E7T /ҜV P^q֡TB)&e;g\١̳)3lpQ|P )WQXGQPQ:Z&cT`I=k ƴ[Lrܬ9W[νu$Ջ*Pq/"sFzdZq9xe#Ȇf)K(22PӞ)MX2Z[U,&&mT$rV- E݊Ӭav$X H3G-$Frij+N-Lp m(0Uފn"pRn&DPSi"Dmbp|Œ0B7h. zfB]VZ?V*ḿ :RErQ#ASD_*I!/cRCo:=@`<Y,Ȓے;Ŋcɶ|bKĶ"OǯbS9Ƅ$) plWAW:L3 .4XD(lڐKHyƛ+"R2hAHrp{knEe}kp?_g*ABŷGR: bFX+Kpܭ5[`bkm09@23Ľytg 20 љljc0/qb\'V dq?:;I5 4 oؔl> ٨ cۜ|kV";X$E0qWdE l2da2DM(bL R[mTZՇϢX H[XJL2xﱨW:Kg;:vj|ϡX p}|v6W9%.cO1bzpJ:~Qfg_Z]9~QUnۏ_T)/^kCp;pUaW*!XrZ78 ^+)'rO Wtb6}  +pzK^\)A\Y҂6}U*7jQZ]N7a|=a>S�w>L/<~8Wߑ= G5a h'RۛۂC0]~g`ZsaW`Jk`aJ`/ /&Aˢ}q408dE|5giESo4bZ@? 
GIl ~lfe$B39?{ۯUXL;_Z_]ovƯf6!m jQwUHL)TZ&8 z qK[gCءvfw\*2ViB8ׂ+ \Ui֯T) +>ȫGxlz*7z?ˬx~ 偕D 4lBd(Uo), ѫ k&(B&[,I&4F$^H10y-4_: LUw;<-ajO/_MSÈxT7M܌cWMkzCw~o?xHK5(;{U\3LJn>U)O}/qsF,EZuuZfvwnXJP|C_lHZ=OOnlyau`NE>@_Ϫo^$(8VJCWn",&fFXuE 9(v*Y 1v!A({a$p&Vg~P cѠ >Yn-%MkU%kք,f6쇉ms +w6_MXIqNF"?5}IkC62IV,9 ]J-e{,%gP_%b9 X(:n0%Vn vL1jD $dE'r/L!p>대XX>yL>u*Й83뀴MTiuhI>tIc/-ӺYY'GƔ?β1G_]qώYirطbGIUqWWCs%D`n ԤT# LT{JJdǦD(9h{, R^:Hjm#[L Uq,%0g$u&Ja"]Zf̚y6YOA*Zñ<'L:`A< Q-Sӈ'< 4Ө+tt4ltgqbXk@iT65X&/ )hYiз3bmhľQFt$ +4Y}jbCEɹǜ9R(Ntǚ]DK2E.hPH(zMgޖ1CL;nG33yϙ_wRFe.MLyM>ŮAºѓ]~ "L*ZK-WƱJ)ɧ,)mQF"PT<*#J2hud*(.^ՙ8[<h4)}a} u*{`\uꊊ{c4o8m<.-vV]kwsd΍f}{dMQ^e`>)#ATE\dYKnI%eWkG:}U!`?EnmP_{ r K]QUng;c;D?_M\r!#9(٘)).JCj,}PNs&=EP>`l% 5md+ST( 8U,x6J;ThpnfJBnl#巧Nsgzae E6_Ƿ~CurRXY9?L^7Uvw8}~v&S* T#D^kY+ ysֻsIIxyh6=B޸\BRUb_ DGJAkdL;_R ;ӌMPvr վM.:"M?ln?6l^`iO9bCع!% i91bZ8Ґx%t۠< 0THJ ڲ,dBv,> ؂T:uP^W܎i4K嶠vgڱ)jQ[Q{`i9MH@q5NYb2ڳ,  JBxXL23C E&XIUlMqVj(⋗( (@P@p~taQ-',jIڊ4-}͈VԌ5qnV+@7RqZ 1TtƓJ*&M O}Fɨ2FWHx|o~{Kfy^v.ݯ̇+ _LixZe Y=~%\_aOgy+߾ 闠Os !^1TaZ~[wo7?Mo/[@Jv ^TGJB\L RoW,2 h[Y;I0Ǽ5oږפU"\;-:!LFfMY_WׯIׯ. ENG޵6r#OܴGe`p8LApI]ikKvf˖dYjY-bz,VGw;(?)g@T 5U??۽ B48f?*\T!HY~KGZYeș}Wrv9 D:PU~M"jAm * .o;1G&< q|6"J7n>Gշ?Vt~Y? `Pӧ$3&_P|$H.Q،ۻC/oBEghz~488.ÚN2sW546clB@uT>mmf+i] zDl~\`5$_zQV9ENju,2).H_Tye<'*+(\iAcdO``iRe۠+=}#w63rnϔBZ0K5!1umﺹ >s"76X6]^o?>e<.u5ܺ=N`L6ܟͳYu-Kl9}Iw^oB@ots|VǗij2±Lhg9ǫ6rL5ۿr2[<7?on8SU󓉃 z*qpZT]{SƷq?V2)|n`cQJ-׶4DW"Qwscq'[7<@_WW^Tu/Q5|ZٓPz>u(.b %>3`)kLHڅh┉2#@'DLcI鞲On Fw;ۼp{iP.j6>+JN_iԾć7-0 NnZ)7TA҇*&h^@/`R %Τ Q娀hX`ZWyp;^uV)i2 e'!H ljoWi@c;hND%U's|E2O0S+iۏ3Z?>[.<' j=z#Kq˨#E; u%_dv]zj8rB*lɨ+$Sv**SUUR@ޡbv [RrY .O oNFNl RK‹$Ox4l]Y>o>nE ׯJ*v/Fvl_AOA8GT_X\? 
Q ={5=?oJ 'K0̟3N+ f`'VZ9)jhu6MɦSobFuL߿xƝe)т!9¦keѶt+r7s\C1i)aÙ\3 3Fv`a hxNI]e騫L>u#mWWJ;uՕ `)+Ap2 DBjA]!BN]Gu% aO3yFG~o.k\ \^ii) LJ-׶4DW"~8eS8ɷvlIE!KcN}ZFfRq`LTD8MhM!r )́ߜտ@7htVG>S%kN|2^L)m!/Q;mI`jjo{jg)Ch(o447!Ż)$]#ԥfΗx4Qw@C8cN> {^ZIǻ#lR딈z?H됴< 5%nБH)Xf"5[\D:`{Q#e 'ƃjǹ2D&L>wA!..9Y6Uʃw[xxufYhܣ}Q+K !=0+rjmN8DŽCxȅ1_veAI!$/:~PhBFxWҸ )tLWIi !V\DӢ DEY lq/NJTz|7%GUa Qc"BrTh˴1 wK'"Bb$`&`:? 4iFr oƴ0!+$3RD(Qg"\ }lIq JN]J#)H|I%K:DBqjoFxiE(E@=Izu6Ԕ[o?g/yutk}>P×__<ЋٷFkTP.F+U/5/?\\AE}6,nqqjG&[ȑ̷eGΩFG Fs;%WJrN?Q_GcTcߔZh*Q F\g''JٌÖզ 0!:JtR"Uj>N.}QIAo _9ЀKֳ͍?N\wadSV,n]LKcU'"0[A擋Y1 {Wדf> D`8 o68{kb>{ [:2~kgJ? YV|8Nhxoe޵5} / <Ξ9rfNxuql$fҒ,R%l?|bUjZ$;?K6}v9~rR;<7HſMJl|'#oǿ9K?ϿA*ן ߁k>|$)ޥw<|5FrՋ{{O}1ċwm#w@Z~/|z6M%WDmQqsjλf<䓴&.IR9!q͔ifBP^_LWv2fe"b³4Yh7Gh,`1 ')3%T-:+$eA]EfT<ڰyL}~ 1vJ]Zy CHűX1Ce02\ d#o|G(YgǗ:M^Lp7r&i΋Z\5v<>cQ$EF3Hi/Ez[>ˮUON$ۥzHoo_xB҆z ׈RF IPˢCv*SXZad%mJڳ^5i(B[D#k^P9$x1U\beViA8v{Vc8(X\ԡ UZ."*o2)f_ؠ`fN+F/MxOzܾJ_s<@#ZVv;~s,`uX`D TP΄qKD@(]{48B1 B1B1QZ00ƪBR:XBnTr9[S ()@ꐢ$@TSӤ.I#7\ ׯ# 9$0[K5l+0j -³Xx4 a8a \ Z5sI $|$155 A6.dGQrNpPYQWwA *ah&vPDސ y߫A}[;VfƃGlKTc lXמ\|jY:|G)bk D$Jd2!ZS^tpXgrUz(U!>AhyX4^3ZW 9,c{tN4Js!grE8>I#Rdc 8b6u?;{ݑ-NjzN{Ƥ*[ H:g4%ZiuB5j}3d(LYoC mQgw ]Fx t bS)G3ʌyI:)%ț'7%ٝ2o1SތY]%Oiϼ}]è`1EZ$/]{nN;&1idEBtMI;0/xi HrqGi9;}|[()Bdp 2l FɩJ,%*h#xPTD(XD*2&8q:lk Xal)WHmȺPԀ=zi!i֒XY{AU9Vի;0nPۛ={Ϸݾo/Rpuj}8U^N*# Y>@V=8NUZtCY$3|=8vjT^NkJ|>U (إ(rQŎʅ(Stsg.[3H Z5Rj~{4yOYLTJHH"8=;B&msɎ B6g ɖȶM .hѱ` ֙B@BRNշӒ5Y+qՆ6,ȉzzCWin}2;/bj JR0hA^v2sNʙLvqf;2F^1iA7OfkE*)&Xkb8rVKsPPDS=x`^Y7r|NeP)Α6FUX0g:'XϚ =BsFI!V5{l2%:>oHH8QE&Xa wOSe/ v_c19ˆ]zC@evTF͏E /~Bو04|P+^~'lFK Ѳ&CGf(!f|iٟ3T>DWDg)Fp +՞x%,`rLG6 6h-.gAtE^Vژ3o`e}ȍ."²}䮦Em}+,o`/oeyE^[&z<81kQ=ک vȾakyA/zF]CِUU ȗ[.͗%LHs㎔Wddt1}7{ U&u) NDTRˠCJؼŮ$k5ScIً e8GOA0y\l$d]aல#H OÒ0I I^giTf jKM6F*ՇOX (D% 1YwΓm3qC[- xiW;],11xr \%o9%}1B#AL&=Ίy^6>৷g}wh@`;Mi ܒ]P:`uE"FQh, %ܥm}V}nq2MQgs&&o(l v\:]4PtB !DtI8qnEۤvVs|rS.<ͬ|'5B:1_\z0~3!Y(7yfo}~lyEOlo_@Y~ߎ{\{@g-Z}z_,GfL:CE|??eN=b[߷)]!Ֆ^Å҃sL ?B#A='jn=w~}#4@xY {|gyd҈* s'gXQ(٤rűS" BE4Ra$cktYo bhjZH+|T7?:L7U~D(P+nH!I#Y 
S9GY_ے^Yظ**eĿXI]r6ioѹbݝɿa찹33lOϦדĉ+#SB[:`ߥ RR&FW晨R&N^< :x2K޻$f 6:KIVMVKEhp6@W$bD1@!I)J6ɳhZޒQ"Wf@8˯ Nכ̩ҹM1f]4ȺX{$`,ÄWW >N]Iit`뚂a?]˨DbW!xdgIkmʐcT#Y+6[dԔhJJJY Yі,Yed&0QY@5*sM CG1;DF @BծRg}r|1r3ٛ;!aSF,_ɲ49Q߲MHs&9,0J yeD2{%"JBJoPn=âM诧HE$0-[{I_>ˣhFo:R[$k1dGeL@J!m2'^jzt.x?.6nJl.f~N}kU9i 3E.h)b )~ T 92sL^uTQ%M0DnH(IYS Hrc9M)ZE%EO.,Bt m`'O,,zB(, pga*)J|!K<9k;K}e=Ӽt)4>+7UKF;Eo=fmYS<Y8S1R_8$CPRٔ}b F5 !)ښ9;؈R^EV㌧B^Y^U(*s]sʎ$@W<ޠ h^Xcɸ*.QNrd%Ma ec̊HM} i'NyNfEP?R 4DR) ѩQ ,%Y(ʛ XW)95v-Pv5xv`7q<)*֣D&Q?eNd2$y\UՇYt=iB&dY ђI GQEc@F5P9ak/"x(RR<L:8g3j]Y*Bg&V^ȼ\@OH~nJt,K clвC/6awQIJڦo%kU\ljԜ9FݨI05dSQA,g:*G,7F؋F(`^1CsY:t$![f+5ULK'F% {nuq4Y)GOJ"Ni).rV@P`5rv(S5z @s}P^Ք)x GLXRAk{wMVUCp;9Ko?_y(AZ) 5/ذ0Sb= o;mۺrgd5 A{&iҰXvYuFma M".;u-8Rz|T|XV][6)xX&+ ŝq>Dyqb8={MkV:ܫpA(îS6{r.2{,2QI*c\!WzU^@'gSouq7pzP*{W ?UFxW=k)Ezfx.ʮQF '"4A Yvs3IÎvz}{v>%W)ͮGwV1$C*sf#G \NsJƓ ,*'DfTD!ZS2:ORj4 69]"9;a'z:uyQ(BC[+FFFEFo9$4-^/]N rC3ŲŪY 5(2NX6[U ,'k 94R JAXyl8^pCBtI{Ƀer\ 'c%6D 6 Zs|1[KmM;w6 ᴏX!C&0F4$e $G0D'km/Hu)N-%D@ &0"28 :el[JrcPŻrD|)W?󦒜Tn1-?^Hb\e9 n>N8ћyVi5}$%0n`p@O{d9D;2q9;:FJ.(LxGG)#fl3 "x0f'3)VkZ!|nOl' xV|Z&!aZ)V-Ezxvv~|+^&+GS+r)~dέv_>qZ_|Qen> iz0o^c{T/g/>fkqDm8Σ|mY[Dd-o#E15Ҽڑ^?Y0sY0# mOe.> =sx6msT6dר]s-Qf5\;RF (]_Nq0Ƀ6LAN  ;>ʿ?>~}?v? J.$6 ~9k`|eo3CKv^,qP51=FV[n+@R~r8m8!q mo~ʅͮIiBd XtRTY-UgU!N|*1ӕT&΄*7qϽkMoS7I$$-2fp=0oJ4́l~4:2 ҧ K״!וD)Q"%1[]X=yU1eƚRʥAy[U'z[ YJ?neS|CI`$YdX08 *IdJy2Dkshѹ^rIf5,g9% $I1PZm dKn%HO%UrVHaD5%b̐S]_UWWYl\eC8>;I?V|oK8'h׬$AGF3DQ)kD>DS#gQEqWiZ&4BL,7JE9jiP|bV@)iz)|9&M*scfEfV!eF@r`:]>IB %K`HAΥǾ^nm҉mOW?+zӥZdۙd!ݛc,Az\͛ɥӯ-f;]g/-nsݦό/D o~j&&rR0)dž<yr76WHJ)NϣoO|U tT^M6o|c"^Ђ؆h%x)R"/Sb߷m?%go+Zv2Z<\&qaхAzvi惴 ߺ#.qkU"($ҙ!UDxP)Df jkzy:_y9ڈ 㹕Kp{u~W.Ua9-oM9O^+!٬ g]qɥv[@\.}*YH9"smPY("`CZkYw벥Mײzۑتaփ*QQb*&֖$wep Ld,8ƵY#sZBF>tp{]{u8@Y|$Ӌߌb?MHe7O$w>{/[ڼ> (Uhy)ݯƲK3bRx@GO}=:] ZF}HnB|-3¤m cA"ؐH?$bIgq #MaJ8%iL2E6研|b󉷵>A-KU,ʁ#̋5%kDVPF}p8}}|OV`t~n]O6?UA6z0[*0ȴN;BA3(y wl{FHe #$JP[ dchJ-L$\>ؘB &\! 
A&A€D6$K tiAM-}f2BFΆ"b8rԯ }r1}Jow0WL4aIrVhF`'-Zby:M6pcurB_Js{'w0|T6c*>9yS#je6XS.C>Rr&s/]dN]ٿ_F=QOjf.ɧ9ѣ9?q)*FAp 3!,K" yu@,D0֭L :pMB]#~WЋ#B 1B*i-pT 0k'"g#798,MjZT-[,G߷EUb[JR|38'ǝ/ݜuYMAfKjHU~ yLeH,#' : 33+%$Et+Dฮ1т,NR||.+ƶ|ͭFH?M\ˬA;kP=N>-Ϙ* d$)Ѓe,ZdLde w4eS ʧdSw[?\`giJ=$IXLHDc1Zlv9*N^ZP{tPeN %H#M]sa\Vov{=[de_[hu>', PN{֖-2h&uh!nr !Je2{D.K `]୷AJ:PF 8c[X w IIIv*O"Iig糲Qg$2T$ZӺ2Mz҅vB!1ZtD敱!rmsA+Lik3]bš\DCHԀSkhZ:hGW_2/]r3,$wBr& "ja].fp VsKj7k<cWmڋ. (2U‚+Apӄ wKy?bŘT͈;- ~ڼ^o7o{>h:U毌kf8OMh3S`R+(}[[Q !ϧA<"DmrϋX0E,^` a;YYS|X&i&DfǥeSϿ]ѼO|C;EYJi/=X`>.")[s8 H=}3лMfS>5SF޲oz\"Ch(h-+t^oIgdk`Bj컭-Tj8hknũZ{eR`"r<w1B3b:vDA)Z V۲:/=C˂뿼>^bf>_;`qSoXi𓢴z# ˾z6kmVsiZ;4DeHӍ9zrqrn\-ƚvk`+vIkvݛfUب9vY-|qDB=?_NJeRrG] ;2Q;6aNwOh"=(nQsQ2FDž=[tb vŒ:ќV%M#zqD?ǿIa?W-jR Me`#29m"|>[8b{pܼY=M>)fݠ;_Ŗy3zeQ|򥣇Hb@DYץcu?X&mf8n돣5h *fSەUμw_lhE)|uy6g:yK'0yt{zZظmmn6\%{\a,W\p=[_q.Sv=5jɛ{ mPd*WSex'\x' ;Y\꼓;ZJuW'"^ va ]Q)sl~ss+zo2L{~ xfoYXsCۺ@͆1Ix+@ܰW. [.Q;8UUJgFiIMYAwYG'zt'_5qnhE\Ӆ󽼁: X4]ZmèmYQT}>Nd2DQJ,Ep %CT@82Erʘ ^?fEeW:$ʹǘ8bH8>;{/R{!F>.)  ߣ#6l|FXk(ᾘe 3fG4:3iw%;yq }Њ=Qho!zpaAF22kĔ%-8J@v9oBu"Rj! R&(3Q٥ J Mdsb5r{Jׇ֯d ڇ>̉vdw^u+t+b.gio^ϼ6.PZh)L&^ $EG8O2D> gJ&'ЬnxgU+kw Ht6}6Ak-  UՎsEn ^kg}r\ΛώV^uԒ'6 [pE,_YT zrV;@:T(J}/W5Gu"#7)an zc rB\dՙ2يJ{*PnUd޶i}6<۲OUdxL;.h)|yDv -at!죅dl u5!S:$ ېd6@K^~ɉCBOBH02z ;k[Pӧq{?Dfi7?t2$tZp ݢIZ531lrXJf0+ g} {Ck8__TIQwo'䴍omrlGOA6,X{}gkiU.% %<@)Kf#`yӝ#9=+U,11^L0Rl(W8$jkj9RMV㌇B^YN>.\RT6,zf[wwC'ד05Y FWw*Q@$LK0gYVƹHP) Em":+Zب,`f6U9̂6j[c0@CոZ[Tڢڝ9D7IMT̺2e$\1C 7"T,Ko$GBdr$ i51D&(2IGD v[dc*P4b5xF:iĭȍЗ"DL >j@=Q̒AzieBp! 2,zNn51qܒA"I2=#I%5B.!=J95Y'\YKee^Vl-\/i**dy9bmL $5L(LFrHR>iS9:6['_yy#o5OV{7>xPKn&0)\ ($.L21_jG7Qw *UZK ħm?ёbQktI#xr$I,@8yw 80Np^u@˗}? 
[binary data: gzip-compressed log file `zuul-output/logs/kubelet.log.gz` (kubelet.log) extracted from a tar archive; contents are not recoverable as text]
,cb4Xޖdz{PNS}YW1~qL|ג6}o Պ7li%1 >aN^@ee >[^@Ӹ\*J \K= u kF]%v:u%GJ\R+TW@dWd.fф yEoO%k$ O8=AO;%̪lgx#%B𥧟?nqLziY% ~ [Y[,uMc4rrހmFTnF<@` ƘcdtݏOJ20t"9'} m+Ɨ >-ړW#xqVrJƼ.t +XrT8z:y_󗬽Y[FH(򥖜N)bxЄ.R2z ĮЇb&nwXg qRW0E`UbCQW[]]%.%+d]1pᨫĮ[徫%7 Q]q a~@*+ĭ{ 8[WQ] fa+`WQW[B8W$L/JB&UAs[\8%|Y1&%\]TMgah)5ոd\+d*Uͪ3ێ`vTVM24'JÔIEOOwR}*zr -Tts/Wϣ|;nuX ײLǥTY&en7iw/{(u]-s%FRJje%LAK<BNEk:")VN1d&#QHYu2LwZ DSxcԀG*%xig4{TTG &ʡmJ]wcRRdz[oƓpU&f7 V4[oH׻p{IEfqJZ3]*6o[O:QT#`\ҕ%.u5Ժ^Os0@tfCtϮ͊ ,eYKk{Ekg奖a47v}bmdv {u5͉QQ lmW66륧CC]ƫqT|m8pC:c\Q%*-๶zeOuzonvG'5ݟL=(%!P=79d `ehB_D2 ٻ6$W?.\VXlh2&)wۃY<$ѢDQIlC**"2FGTȄu4&4ĘK נ,I J@]$faHƑfŐz4(Ҏh9b}ݨi:p XAλ/ii,>E+?zh@Y/0)XW5DYw^AINQ#E#T^^F !h/0HJ-r P{Us@jTX$$eB^Zck6 2"@v|vN3q`&((]JNZQjsbG pϜ%8g$owzWOMQ W~<ӏ]1;O|bţEH>a)h {A%Tj 3)$ttXgJb(y6yT-7MxII>Wɗ%릊':SgmkdR΂TF$ƚ7}T퓽踴5Z>No|>pKF- ojzI^x=hk5].jViObtb/Apyu-[Н _ĭ_ƝOܵ#YE7B^f>wY6[mڏ%7(Ͼx6bWozj zZ>)hI*kNӧkxY^_+_4⧾%2rύ'ԟde?SRJ'6?YϾ~E߅wZX.$2Y_~z|{em~9LoX4ܴo-SG @=qL]K[Y+_㥫/Y*^rw=?2Mx1soLʥz{Oͫ׬I p1;ѫ LҪf] U{G-f|M޿ge)K6YU,U,|ڳg}N5"b1FdȬrۓ UQӸ-v11E*֡` ɚ$QD!x 裕T4FV^&P[߁L qEUbcԤ B`ʧ:TdH2*&Jr`ݠfk4gk\/-k1[cuF5ąU_qTk'TmW@*[{lA rRMH*(Czw1kW[ jԵ Zujf0Q Ude )f %yU(KȬmAFm&\tɢcI%ej+^ǂ r DiZMz[+rUi,m2)1߶oq5=4=ޘdo/oQ5CZKf j J "Z YPY:7ꜭ%(uQ4QrIh>SeX4WY4ZvygtȜM, )HQY' s^h"6 Kqhk@Y3ruzִځLV)6FT  %29 0T!b$2 LRq["8JQ5R𘲳ep)FPN1ΓҨyPcݏl.M#7JOڃ^䲴, &vQ"jƗ9Nbr%ybTMǰZā[{1Ͷ~<c$MmGrQS8md]6¬Ml(wZSK =5cC oJG^\k%yǏ&P_ l kD vRefFh'01h!Ji,{ ^ΫaMl /d (ҝ (=i'넄1䙒Mг"mlJ]}NvhPlu_CE G1}GLEaCIE9"M8Yl jbք5j"˜X&Fi6%1m񩍉et'F ;Eؤ켤E9{h![Fs7S*{F P11s!!8$qC4& ,"X Y2e>x}$%i0"&\>6#үp74M JXb_&J 0G.9Kd(5}I%i=Ga9Jeه } (3aU\H%6,yF)Y [G*Y(J_Wtw}K1l C8Co[>C_uGxf =S`Σ\s.F֗)x3yq1n)nk_?XF%?sI*NO"X9<:?tVfuQj O~鯏ֻRM(*\,X.zlm$K߀eYon>~2*| XBmfxmKss7庇ؾڐ wz=yyi,ozu旾- >;YvM,.gR/&~5?XՎj sCث󏬗gcF DRR˨buAOXgW!yXF"ۂ"b$l9Uq "xa0Y .Ҩ)@t mFإ fAwHK@mW-#b"YԈGȹe m`#.}%]5OdfۯҐ;^2u>Fy5C 1Aņv؛*c bFLxoW&oM.´tS u?@& r d"PEpn9~́,z@SC"Ea"!\3F"BFB,+ R%J40$M(Ԭ4&P A!-1Hg5([@H2 o dg,]4@Ӄ"OO~t2inq2\.l~m Y g$4BT_ ;Y*4I D3O=n[4ٶؠ9#naf7ECFkK]?`'3?\eaxBQܰ 
8mjpyƧ7k?%VV.)7E*E_:mR%24֎XcU:犬kDi9n!Ա|_{yCx{fǽsuAmЭg1jA:*Ŷ\Y3yIR Q{Œ ds)`nǷkAǏc0K0{#6@;F\".Ws"WׁvEՁA0ڰW$ץg-b>[D  EvքB΀E B$,xTb퍒\&r, =趣! HNE|60w>)KADad5*cd}f܏giMw>9! QZe-P:2x<9+k2m/,u/!()Ñ&=$ E I>dy"xSH,` B l{$&5]lͱhME0&ɶ-IJTH|H-PjTY^8g2FQ‰1w-qHW $"f3^tfӛSb!)Kb!ESTeFspqddc2 9xNn&0vh?r8zгN0yDm'ˁi9\Ο>aB2^sogo5&W1!-U$aáeJ,jFyx jH[R\ɓJ6SWl̅0RlĉMQɣjk҃`<' L뗷' `-+|pQs H߼zp:2J@wa:br<癩___\Z%aE̽ $% $Q^͊:.UA}$Jo-7D$hXԍȁ$j76ALLWmeL2Wӂ>g]D/k&tA_8"yq#G)mj$W!1H5HM SF tQVM)R6cZ PSfj|ionOJ]n>އ :b,K8gG[)s&wR|ַЛŢUDs7Fv=oǘ85+ sC1XWMTPqt3Q'Ǘ}iw{{;^qqG{|WqLNE ҽ2O͗6|$/ -jJ&8 ۜxy2^Le(ެ"xU ]*8*{Xz_͛*JY; ,rvc^ [qc{?|[ɴ7 rZ6YTkR)5?=O'){ #E0>;+||v| 'ZJѸl8D%MlyM3vvYXm!2w6!g=zxx<-¥I$Qcg]ͪݝlr6\t\㳴WKgN+vq>`zߕ9;haJ6ғh/8?^A [2ksZ{[i8^NeTV~*FwAJ.%jr\jli#DV5%rAv4rk9Sp I n12c6q]0 ARqgTUqx)d.~U.C4v~(n\o7귫x|`NN?1c->IM &R`0 $m\MWk-hɓs\[-7.a@ li=pJRĪe)n$6֩h-w팝+=-:^f2kkg"ٚqSs]c9 GS\wg6Vݔ2prp,pP,p6FśfB(r<*0M-,lõ=g0x)#Ɍhx`'ܗ#kL\EgW}Դ]thK*@(D 7w[n LUSY>v MRPfWg;#~x0xL]Z%/E7^|աR}~ B%S\zJ -DFa@FZ8ŝqγcW7~ w]vuN9rj^\]D>Y7~tj>xK:gGMZQ|8:?ke0W'xiWe"._ۿ|I q?k-cwq|?J@76ĖZ܌;~9͝g#6Op|?MFUc+)W iEM8Q~挼Yh촲敤W1վ$bDdd>O p1 a ˃~5bXX'qojh}u(Á^#]jO] ] ~@HW S=+o22h]u(t JFOfPM]͢+ZowR9+@tYjp Z5;d:kϮ#[W]pw~UGlpdL*-,{H:mڸ/N_߀ʁ~;42/nW?yuOM?vqq|_갤T[u_DBg?Tq)v=Y?zvf}_QލzuGG79J"oG걽x$.U[S%rTT1/@$(O eaf^5ys;w͟3Sn[/39<=H9d$Pƅ[j2[G.XŻCb_8Y[!+mn8W61xO,m3Kރnm?p%y(-= 䬩(:'rVECy0\/PFSRY]ȩXKrqiʹRϴSyqXlvƢ~Yh@wKSE@PĘȱ"XT%:eNZ-hstYD+7D;hѵhr9?=}IK;Wj6kwL*tPRDXR#1bN",=X؍hk^"ZkU?^["^٭B*l8WV݃6\W ۜY1d*L2sPa39MA}4 4f5zٻB~C~:jB~-N) I4]Ү/NRi6Tg^<+:fDyKƜ9K,YOT訨ϛx̪Jmkj*uuREC*)k{s2%zsa&8I1я5`VCJQQVR!Huf\T6>W,|bvj^,}(:FsOj 37% ̚#ȧ ]S{A 1ȎhOaֆR3wi†"_uiGS+[c{5PQAqz< wԡmˊ#8#N %*Ir*|utA[,<[.] kBhcnuїq-YR%HJscvF {A1LJwՒmWHecXH ̆6!E6>q&X(\R}#Nm4xT2TPm LB2Y6@@;W+dW1w|*͐jPo]\?1l a]̷.a y&? s3xuqK˱~FA{׋qIX1,4p U$6unb:1"_t 0g|g A&cF'`*7,Yhu(P! 
\k sy0d5FNr3c\o0@ַ@|27A@x0foR9CtytCh+)a!k2HA(Hy T"Fi 'O rcd{j;a2M ' `qE bJSYfm"; CV'y \ةP E1!Wu;!VG_ ?gwynp-@ˎ9w rpt΀&<)Eza~ @ -é_}FqF˖BC~[{ 7s^yrv:àG*v,M_o}W%k߱/iC9ut:Oyʜm7;ۭipd|&?#.i/3q^VCսGVCqR$B*Ǻp(f?q(3~9t%Ú^Cvk3Zx pn'7D/ XCWWt%j~-c[,֦BjJH-<1TW\T—rDŽU5nRngf|:yTNJ5Qi_Kͻ |;ɋAa^-LdyMA&+< \5W}|%),bƺ#-/daT!56R)d$u̖#bJÀVpw1(&1bX:nvoipAtp*jDr}+՞R]`o ]!2sCSI=+gmw:j&J+D+zOW3]]_oVBBWھtDWHWktAtedrAwhOWT>ҕֱ!`U]!\_jG՗CWMc}j w;Z Lmgt7+NtئLq +Y1tpst(!C*a +U1tp5Ut(#G{0{$x-RƯLmPn˻/TWWw՝ MOڬ9C ҕAtηmR JwB,jp,+$wgnV~ 4j B@VCWdp]g3=;YeJIt.G]!\[@+;]!ʞULt2t!Ҵ傱bΓ&/JAG0mקmRyRjh P)mH^' w f0'W{{o°9 #ߏO.NS!ks~;W|* RA/G{J6~U@OO#aӺK ^~Js~mK@w6d=VvZ~q--7Kn>Npa|Ma/T{z@ Qꘂ8fL ؎gYR>4\F&_ws#NGQ,o8finyr|>j揻 N[s#(B~ʳ% P¶٠Uf 72W 3ǂ+㥞t[z7h-(h%3+'\rU-E g߲׉S"U~b{id?:#NN10L:ĺ&B0<7"$i*hO榮y))xQ9|V m BjnkZ9E)8 _+v 疴[n!9ϓb3v2쎖Y^{KL Xj_TX P~偢0/G8j\|d^|LsJZs>}^8?*TCoK}+jD!$uVA(^c@2&$} 9GB ɪX9[ %]PҴ ^`Kg*ҷ(~7-e-;&`PqOl#)1&R+KUG5ƛ&5QgV[JeV2?Lu^j&~L h#n]تtmWX#˳F2ϧiN@9sF߷;XLi܋Cn!WX:7W{:vkQz.wf64J7\vE UD~\kd)qaqeHq_d?Tyz|9:Ź[Ꟍ0reObYz7BmSil( ʱ`IYnwdn'w?ƞ%~$gȻ3WIaSm<*֚:g48!; iKJՑ[hw %jJX]rhd^*F%rp2s7-[eIbOɿqGn]Y~_7y]dO˵87VYy"d "iN"eBb"7 ʨh[p&y iu+K5 ͹>QzeV^{uxFmΨY-4ն6d[]tٓ3ՇF3e깜NpWzVWJ2\is;سe[sqF90]jqMbevmq򮮥269d&`,xQ 9dxICmsJFjKۈLRx8`Kirl7=Ih۲DԏE:JCj†0i3$T UBB'$At[̺!* !TF!=FaDɡFhBfl(K‚UD dLR32rAhJ}NJhk`M=U f[1[KaYXܤm^\C`U&=`P~csFمP_.KZזBE)m*< %Ȗߩx Ipz1d t+VŠ(Z,a2s6*~ \,r$ d7 xɃ]X}߲CR{bVp qUo4F[}\uF,D M`P#lA!aټElYh C4%g0Ē5lGd2pJĠz d")@ƅN,Bj 1A RX$Z}гW7r?\+$3V8aǯ_&G̨{3mnss3#E]bQgxT3"MA5`JE7@oD yjHd  )9P}Q$ik [ C& MXS IZ_HZK5*D:TU JS Q,*5V@NoD_jcDasJ;'̙3/Y'Xs^x(P} пc3Vn#'ӏ}ĉbi l:+h0I 0葒#㎄Cwt%GG!:1ʆ\$rNd(Fȸ(X}Z9\zECcKҰjwNvSjonbjl栫C 4248Ay1z [W-TTD>2P&SɦDY1{oਓ ޙ4y3 4] YBQ1^Z< U89%^l(]^|!qtM;|.˒ ƭ Ǟ z)JZkͱ+Et@bW+Ec@d?ixbFTԃK9%89&QZ ׏__(_ Q h%m4wlb9NKY s]?.U2&FEIs (!VY"RrH`|aؠLsoکn#ISx</luuE#ՈгFA#q֗a,STM5^dbƨv: AoX#)ȶq1`dT=d#-c7rk̏:^7+:&_go\P{֋8A/R>-M)BA(Ig,kb\J^4ЋGc9Ǣx9T&*|{/%Gmԉ8 n~`mqEhT1ZWؼ8k(PAxi* hT27kϔIɰAFQ:)v-.N]]3՚:1׫b忍e2~Rz "~oO;] ,{ ^;uWO?yV'}2P'Gjad`x~'Y}Mc*एBsBm:jCn6VUE%_Jv 
]ީ·Jm>?_ms~tq~q+s#QI#PͰsjC35'-ﶷS̿_>qV#J.:7VcoGj=Z65~ٺWo=ӽ!Odlnv92Φ\-"o6;?]#Y9G:FuRt,ЦgZ:}Nc;{̍"nqlr4e<_֫Z/'fdx\WU4J0,l(hmJyjUb#=x) S9OkHo[;hl$R`)3A B+1s53/IYo[Ldd2$aٰVyM+={/C *!6cZMǹ, r ', AgPĸ3y^\]KSo<&sC/QfқavG71jZQ@<-] yGH 'o"ZFTlww$%xq`r'˞7m_[`mHhoUHyDX"i\cZ#6ˋIu]shʪAqVcי<}//e0V j/(4K&hT%Qu0`i(-9<ҒBNDFucwI0),$")%DeAR~s5 WlJb/f#nÚ鯛wܦ)SL⏞ܡibZm͟6Jjw2Jҋ)YRwVj8❕J?`N}nS~vKyEcHC$8pA?.VE#+Xf?`NqlmWWݮ>CEAT*.RB NklɌC;ʢ!E4UT*@2"`. ) $)R9jTO%%a p=;R{#@GevS:,8t8?W ̏uNuwq\&:{_n6})i|,K- cu BGolN6̯вĪ d4£h+6àA1xpRDc(X|ZjlJ E9Q'akZbSBH2EHlpd-0qA]6H9EPţQz#gb?{ƑB̮[`,v3l6ښ(G#=IJdl)V8E6:Uu.!_&~jOh%į+}^,~WnӚ-i]p`.\9X8tB iDQF. 4M8++gxˉ$  sس,s٨̳%)c;r8Oa68&Mtl95y#l8 JG%J*^V&‚L47ũ+FoEDQ(cOC`!7;;CPRT\Jlb̈́C auㅏԊ7:iZ'=*]sDH0td8aDh_\i "y"PQQX}bˣEю4/̛\6Dc\(d>kJj$psƮt4J$hoG$SOȗ7ZvZ MrO|Ë-/$K@FDςW8YoxcwjmGytc5u}𒮿KqZސ Clr gGGã;,\vgWoֽ1+RLerBu3v%B\QFӷGW{e-([j9 Wm O%>dnI*)cMR:X*hqJ[$$ǵOMtS2XU ƅDNi!%9Xr6}GӚ6[ 60 DPCKem\d8VbYY`토eu`-s?oR,@ϩ$;i!-M8r:B Ŏ]v49CH8 KDbξрKZ\&-vL=hǡ47jr#}3ߋ/L]3:X,M8Eg1*J0d%xקDxw'G h~}e{fژ;ƜႩAh:;LJ aE%1NR:o%#62%9U6j#Hیy Xxܲd "S <0ww^fEūQEi]I@GrIh(+V,N!+4;uQE]\.QϝJIaRsuuok~Ltsob.en>9B;>)<"iW߾%o_8×#vyy'B ?c'%ZgvYz'w\5U>1%g@ΝEtn׽z0 7@}i..}hv#!=4ޝF9{6,*K7.ϼˆZ;'^]\{`O?)o3))ͻoNfg/o[_]~B._ſMIPB* HW#9PWhW(v\Jք#ĕ3NXPfpj T 1Jnd\W80ɵsq\ Sص\mzNdC"`ɵ\CmRL:B\ iV3W=Ф6/rן_?}rڟS͘>?3g슊Bҥ 96gxSN9Js Foc|^kw_o/ vr~^"?q[TJ]]/v#/Vҫ=Rz:/.oP9-\5fPJE\}b{'\ZS7qhC;ɵlKPbWq{+7Kr5Tv'\Zu"rڻz9R6:aL uaa3$W{S8upJMڶqL7+M3" k-τgBWy>HQ\\#[v\Jl1J\Qb8ejìdn}lԀ# C=9 _qFONd0 \a2t;9J's(WtSnAvyS5+Sew[]%+=2WS."Qn}%J؆a m5\{&:#j7%7lZ1uFIjW@"0q5+ Y#ʕR+TV1eqEre3">%WL1 8 }X"F+Rkw#+WVrZ 6\vڭ{O(;H WG+!\9brZAT)+aVwd$؈CG9L9W0v=WzնM5. 
P>u5HaP;HpuBZl[VWw N?J Pzm€F*I5`: Q-.k4-qȄ4fX}zRzh)zRSJ15g6g4ތGrAbZgj7P%?e$8SD3fpj Ri$pu֐5n`-9E`9PQ p,2Ӓ B[X:kW$XfpEr+RqE*݄cĕrAK֕BkC3"´+RD"+넫vV2Pq\\hWvzvtyMz\F+ ͸Zj WLj+hWTLJ+kY ZY"+Sc ]K V5He jVuE*pbp6=0?|01Ժ3VY  Lڶ9VplW$W5+R v\Jk'\!W(+ HVpEjWdWұBt3|&ϰbrRt+M`SI0uޕFРn +[:S7R &4d`eL3F/= ڍ^R)-c4z;v >kpjW+T+R#!\` XWVW`K*M:B\ɔp v+kWVTo]*Oz\\pEZ vH U f&\\ ޞhm9ǥ q 2)!Vj8f _P{NS+΄ӹ^P+5m(0Ynv$X<6d\p5{u02E;1m`ٻ6W#bun d@AkahQ+Rq9EҔv[3,ǩ[] ՀkPzCBW ]yoX듡7@^(- ]}t>S:G`B`'u?+/֝ V(tCڕ x;tlotj5Z & M-hL{!(_gFP M{~ Fm5"krw2[&m0d7лx:~j]6UiD>@z=m._qWilWޙ[~kqYu?vJoN]ƙфl`>C|lO!j0S6G^[HoFj~6w?Gq[:Q GGy"An3p@8|̔w ;3k46G^@=^vގ7ˈ mڲ7r%`7u*irlIs$c &KUgr93*7QqJQ~yosn4>c{UUvuD30PEm?鞨(=pW9&kt8.[JëA6jTm [Q甙B3 ]HRUmjSUjt7ذwK?QlB#-wB:D!;z]D:]ضT7YR!]AK{qA1#3X2d]șcq|}asysYUȫ;;]SI{k <RIIT1ڪTw$#@1vnE"N Т-)$qu$~Zc曄E)>9* X )$Z:V(͈X8pj6z.WP< Yb1Y\S>Em5R)0f;DrNU.Y{u=5B }m.5SGގ`ƤUȗBi`ES }FnQxW 5h NAkah:K+MVjY @rܪCZ]] bP XBs+]S`1wuV6 ]˘ Ͳƺzֽ?s%ps lZ(aGR7fa=g5 C*ZD@ %X_;wMBAqU&t)S`q\b ~N`K(J J+.@:0 Ns[29G!*(5 7EP TBM'$Idd0*zbTR uWh%W e2a̷6QҜ` ePDdBEA31ۢU<B:nY{$$XYu0wT \|})q$8TR 3ˁ Ukq(9oL&vr/ōbUڛk((Sѝ*E4GRFy֞e5V?DD)J6􅲎_Qzm쮺r{&FgU]%ѧVK1=#̼BjƐ_Μ7@eR"ZJ( ev1!+auh#ǻ =sA །wA/wh8uHNoT U86ʐvND( /;v&Xfۮ2]/7^W f=f]&>MG1*4ʳPtdW%S0+:XF SQd4XQ|`QBgąs՜ HDN̫ fʇ=kn 32]c4/= KH`Ge Y%cԭ)x $nGfp6OEUgUS Y_"bΪڃ d-Qةi:B%njׇ-|0di*YrS,1gmG]I0SQA~,Ck5K q[O -ChgSRC;V@@ z-bB bjw ,U3ڄ`1bLGВ-j1.Hd\!ոn4XdЙ$ fjbRE ٕ P?AjDPqY]Qõ`Qy0 !SWI|j8ˬB A1+'JZb~C'ҸB:kϢ;4IyTl4P47uTʵr';nU{9O7k 2$`A}+f%7lmՎ6< cs}lΗgEZ-wi7v\7*d/A[.n[m#FOʔ6]7[`{O(8vR%[Y:Zm5=IP'BCkBm1Fv(gNoUfĞt5n(!/[tؚb樇 tyB1C[us"ڛY'%\IbLvЌ2 R.1#K[P Xoެ7a; V XW4#RNnB1Eʟ"oQ^1 bp GYTPcD4wv:yT,~ xCڨKecT-x4ΘJi]K N[o_^>92E/mq.85j,W}<{by^Fo17al?Gz^nΞ?8濌hX_o#Gڮ6]8y٪xv $vx۶]Isq֋bTc+M>_Rɟ?}yUdWǓ1\}+Z Wߢ ʝp%+1\J WbÕp%+1\J WbÕp%+1\J WbÕp%+1\J WbÕp%+1\J WbÕp%+1\J Wbկj {Jt Wd W@7\yUշhXJ WbÕp%+1\J WbÕp%+1\J WbÕp%+1\J WbÕp%+1\J WbÕp%+1\J WbÕp%+1\+d!:oK71\R67h".bÕp%+1\J WbÕp%+1\J WbÕp%+1\J WbÕp%+1\J WbÕp%+1\J WbÕp%+1\J Wtq 6?ՀOp5 W@ɉ[4\JJ WbÕp%+1\J WbÕp%+1\J WbÕp%+1\J WbÕp%+1\J WbÕp%+1\J 
WbÕp%+1\k0`œcEhn{?@Kku7,/XEaViNڞ? h>zP(okԓ ᓡҩ@ӕޭS7z=_q7j7 jy/xy}QxcM{rS{^!! fYbz}%-~^_/P@9U+?.7Җv1x\t1&&EGEm.6?_^A Cry߿{-c[[ʒe V- SW|p2_0A[*9/nzFe%l!p.2M'$%"̸(I߷(4Ijvt2t5z}*t5ІG襁2ʥo \u/Ϻ7]C(/2qwnn[Q}0^({f͙2Vx=]\M+zs9c4yA̸3޵rk׿bC ܱ͗Mn-ܤ_|GIywS-ٖ=fAgŵSAPW\MUYJ.hݩ J#z>BV ]`m3Bp ]V Ja!*mw\*h%c+]/1:.֏O~teu0DBu ZтtЕ`ϫzN.a#;{Ht#ZBW;*tT=V]Vtb*h J!z:B*Is$Q>}:e0Pf1?^S0nkҬ- %}+"_`PuU+ V^z[e(i"NUljfZzْK*!YUYsU6q9@ԯ #d~QHTG( K>,V;tU ]hD!"}mLW誠!((JJkQr+N8ݣΥ>-<5SX+4+-c;C]iBz'41t CtUR]Z* JT=]!]iLꎺ"Zt ZzuUPui3tUZ"x%;CWUO@P CO1aSV(w7ecW|=]9Cce[ڮg{tUP  y+`znhtUP 1ҕm٠e#ڽAcro Zv2S@(/7e KY`ctz/4C Ԥ5FWv.'^B-gV{0z#޶2U2_Zw[ؾK+sݝͨ*gp8?i08~9_ g5Xu˗.`yxsW+tVHB/3&N%/=ǜ=|]%>$T@@( U]'w' NΛ.\ЗDƓy&T//U U![0 C/' wCY/~sY+:3S֏Fz:B\1ӥh̡;nw誠tUPւi{:"-''\3u.)rsr[q#5v z̐Mt+LWhtAIc23] *;AJpu&+(c+B.`ݡֶ>PZ#]YPЂ`+pMgfD ZRB)z Yp93^#Zs gp'([6# ;tҪ/;0Bg5+B\ J=]!]mN+[\6EWUAkdPJz:F\ v*ؙ ~JZ_͹vY%kC@LFpYpa%#䌖TZw}K %a5Z{+,\,͙#}>?r=)?b.|"a$ӄjl >H ǹ:^_8] ?n8Kv ^F~!1/=4ܓ5oW~RjurtۄWxbXy=sí9հXXpƆo$pٗ~ճɦV8an6zv_J LHib9OYE_T.Q?&IAJH}Oۈ1QΙK"+)5Z*̏?N+]3'?r0CX< {`4R#l~xt=n_/R[ awf<O>]Ɍ+u'$Gz4"SxuߣQKT\~Z\ 8÷ǦvL= fZV[7eں%^?!g7;Mfb<=?&‡tp^?]~~LhEg]ElYWk_VM6&=?4jbTǾ~ d$Ղ|7E)%H9ξpgI#O~&iMͮgg)(ZzgpZM Jϒ jUtV~If'sջiĠ2}rVVTr}ozp&*lfO߀KYHWO *h+Y{\ĽEʯDl{fJ2SKs&jL tg/PI)a$xfpQ(.Fg{MlAYպy⚔ 1NE(FHR\fkOR+ ;|:0{n!N$G=M6ZyO˰Zfu=)3h1#YФf@'R QqQDa^YbDd^|LSl}|m]H.ՠHjWSU6%dㅀ  X"&]J; 3ٲO1}mުo-8lKp!z2hzYSuwg73Y hʿS&t6!ǦMp׋܄+cd b w|0x1׽+y2iWc%s=&\]bܵCaه"<8s:r0Q8tف˨DV1q Qh1Hg<`'@*sBFGС>0:e%T6,P ֻP]^-5Q=wAQhlvDvHp \SիWa]=rrtOM܍?;g[E캀Y׺j-ioWWC7JFԕB aڀU 欫̑ Y! 
sZ#Q{有6 Y/b{/IJ!8:&@G,i</9,p4:$!KIDDb66*d4510A̘䲈 `8HDHB:mX6i!{/1zkN_4yqpFrߌaZ6޳u+nڂ' ((ePZ[dr&'KaP Zdn\ђM;O[42mR]))V}>[$8{|n&m2ZtE]鿭.2wQ|W!l}W҆|gmjO9sjs,bm18 |(AҞ#̣QTg4AĩdeLPh\$iJjP1?RHlkJV!*]M}Ef\;H9рO6v|qFURᱷM46U6 R*!aCF1(T4Y8J}kQP™ڪvԫmjL_<2^hB*-qoɼyv@%Ǡì%B;se1"OU^D#x>ay d!?9&’HheJӊ`GX@9a !Blb,|W%3QĀ!-;o|ãy~Œ@K+pOq<ܣܺfg@[G&M2> hevc4Z_ f|#Y\\QE(j0XU1B|均eT$z$Ro%[K"fa @Ae@oS QdBQEY,"(%J"!%QR0Ke)}bY!@J{n1!4ņyZg6I"2IHe^=ffƚ\|$|@*hs'KIVWpAj;t2yx9i"N,xsUֲ+,L@c*^)9"kw$%Gd-JZS F2JXSZ4p@k7!z:Sʍ*>LrʬZJ>b rmZ r׳5lѼt*]BL&[ $i[{ K}@gZLN*A"3vQe;\YK  k U58[F x<ۢV:۾:`eHn)N#~EYƳDƔ}|YrqиaF-(ӀC Ee8W֢\PwCyTs(uJ ȉQI%ZiN/K`sdA*]ܤdÌ7ܒ bYVg.ˤ+A5,GLQnS"~[K`|V?Gі` 1$9JiIxjP Ѐ !g}޳q\WҦJ~0.\ǩq~h akT>HJ⒔4(g &3gg{΃Kq _X,vGE38@$PNbXXHRNlRZ&L|~XPg}u}hލֵު)sqYhivM/ǃ.VX׷prV'+}oO\2Sr-jw{{f;$%# iG'V#W"Id4xO؁tG&o*P2zEG XpiQ[ V12#XP1"&3fgTlJ6^ 3 qf^{^W^(=l4jʝ=jz=uͩͯf8|/̱t‚%2  ̀0Wt>b4BRJ$jo\D-sS42؆dc%,$M*Nfh/,.D ێG~< ;ʵIfMzkPo7G )>"~BT)ـ#e凑VqB3 x rGQC M3ll燕QvHP8b6x(G9"9b7Zn`} Ȃ )u L5 s/\DA:-93.P4lqnZad`(֠~IYcA%\L!= %wɨll爟j |qZ6uHlXP3EŞ/n|`Y=vb,UTTp>XιPu;_Ogᢒ[9A.DG?>Q#VT_HdԖ]_|Tm,YKF*2hGBH:;jBz&B$Jҝ/攏c{fU@ 1V* }/Shs"9w39{Sj0,$"S`H 81Q[KUT0)tܥؾ]xVG*#R"%`HEdU)Jaf@4kƝV2գ 8 (O$}.8= ga.;Z?5;D%X,d0W( JPa"IUPDs{^u Q\[]eD L*3Sc@]HF0уނ'"kDhHfwz(gpa jv(v8SZd9urX>&k>ҸI0+UYYcgɫw"ǰWxqӕt1]Oh8I]2'ů?/ÛƏFNyBe pa烡XPcF:ۙ&45cGX!2`)"GL J1uGf: vx}48 C]6H "8lBR(tH\ymNz hX|Xc/ Ч;u1\b"Ո굽lG=99]\(ZbC3RɊ#(XΣ1|$J-m39q"J&K5*=ōgjFZ«bp/08{KroW#h[l0^2Vs%ׯtѵ \}6.,h}+;v1Ihxntb0pU ڼj]v=+d)N=4>ߏ'.M=b6.Ƶc_ܙTaR8z\9㟀y7NïoO_y:}ק߾'؁Yp @ (c{K#ڥb;,pΧn6|ܱdĕ1;p [_wi=_N(h~D8H_вѽv 4Pb#Y%،QrN{" /3Ebi!yta6<l8&9 MsutN[>aMy1"{l[u6K 76 0)/MR6OfGu1xat Oza;G-.lIͻYM#Kٖ}M{g7f^Rwګ#eϼ T;y6oinl'NJrzCP5o|~ZS%+OoܾU*˭͟6WJ6_VVW#'/2XB(:)5ոd%1RRqJ7bi1g;Ja81dg;cYo{=ݮJɻ݊tgxO(+6y싻9AmQ?\٪v%gx^kHD+bM`PU*z]C_X ,W^6PMoQ\7p FʹN hJIӎQ}Wy~⼆*5aO 2YyM`pvڬrmxTŮo&?ml y `I~H&?Pj\^in宮c7v|LCaE6<;MAc'+śm5##*3ԲebІTU>?Q$U"kH& ST}0eBaZ{g*׾pC-h65Lu6ߍع_v-yDvmj R%j҄mhi+3_mx]]_\;p:7ox{goؘ)o{1)WCo9geO"dppvt3p 5Y5m*H{C.b2wi$ŴF&c)Csc~O;Žq'dŀO@Xzdk%go<ˤXlK`+ց11(^wXPF1))i3p=+Y__$P Gs=5\F|8 
4jG`yʄh3#3,L/agAȆ쭱ZF3߃Fe W=[R}O3(Z+ЧW'38Xc, n[u&DGFB!Q2ŀ$A{A'*ߝty ? 7׃`]yNu?,W~%-}DQS so= 0M)'/Cdr\6'ͺJKn(6Va*qtbYq$BȘTq@̆Ͽ [иhݽ4F'@4'3YT;J+5$"!j%#<ޢZ 8LT*ޢاhӠ(&95S1H"ɹ<E| ƝJ0@BJAY !1<xBR \ ;v.p+t%rXP _ ﰥ~r+3 [SX%6TgMOڃ\?Q+ݸ*:E!RF1*K6k.yT{+ b zgH<x~T;ۗI6Fr^==2cԒcgrDJgNv8~{կj=woEŃ{?;*'zNZ{DުRc?6I3Zr#SYeY"- b{3*/B;ѐv0Qp&c:QasleS2Vӌ-N&gK? ga|i݆!)ecBX#`20`"`(mb&&{=_z N!`ބa6[Vb+&o:XWrI; 6u '5SjU4#"K9eޗ9P{GS&.ۄoL+u`.騃֮L)+2*J4!%4R B&:{FDMH-NJwf"u>س󭗺lOȒa/Η^1y  "Z5``6҃1p{؟{l]YmC;1^ͩa -E}fkXRd _lr& >ki) @b.d_=CQQLL/Eaqqٕ(rKQۘZY)|V>E &x %m)bQ7*iROu7 1ܧą`_ҺޖR*H$&*:alC6Qǡ$/c&H3meUhqCnܫZN~&tIO:e؇(my%=ڠeedor:H]$)g)!%_*a5n viY*c- &q.8(Ur2Ky|f=^2"gJi!UɱP+mXpV*BJ>J Sه (g]Cw9qvG'f ]8FN~`@s [G|M/_g`x{{෮~°p9F<W/*|ct1}Mew. MZ&}Nj-wv}n*LN ZYїF9ǿWrloh5ԅsL2ٗ_af|Sۏu|hɋ:tLUY\&%|G'?<,w^ G܌=7rOOh^zM!gU^[{Gi~P}L/Y5a΀%/EB pUUPZKWmy@ _'Zo1q-YUNΊ]ס nO"MAxMIqˢ6HA4y>]d*rƝ+d44Jq]:h-`dR%v*NA,;kmDYWm>?o>v ]|BmrbH?/ɽԑ3ufIs崔۽d4kGt̆f)y𩧭zKEqIuYA4ޔL19ÿ?`Fѭ} +8 .Dq`߾u5'3 @R3@re$:T FHьpҪpAlpErru\Jcz\!Bp%W$W\pEjAwWR,Lj+4B6#\\aru\JwU/O2㪝`Vr ?pgZ{=᪝J •l+U5HvZ (ϡݵSٵEw=WBK2#\++WVTqu2\^- VW`', ~y5'Xj5Ng+kl؟M8O›m|F@b1heYT"$3 `*9^Tɶ[T6^RKT!*V-up{1!e`x6UIYS?FP|$W\pEj :jN?ntl/pz~]*-3~uArR(X I&$RGim6#\`)%*\Z˻+T Wǃ+cVH=+CJ H]'lW8j} 6*TWLj+gi^5#\`'gԪTj z8a8|g`{Vr8o;@[9۩퍂Wznq*#\`{r+X.BwWRWG+a8F+l8K}t"gFs/ݾf,5WBJY.U!sA.fj'3%twϔPNgc@hSjQn4avjƮZc 8T \WzΙs6#\qT| t+k \ڻN;8e{\!+W$W\pEj:H=Wkr9]&ݙZԖi4Ϥws@KJvt}%w#+2 Pp$Uj?ஹa"U6\Z#+RiY#ĕuࠒ692\C'{lVuWRCkG+-yc[\FV=| e| ſ7jrk!a g–l&lQ&GLji#h^F2m2B6Hr8Riz\#p4@W$9y[%Rcĕg'E^tBF E#CjާpN2wƕX-/zk=5|dUhګot ƃ׈~kOw z'0C^e`?^.gil),dx:7+g8ϑ)R$' QAncN5g$`T[ץֲ'S Wkw*NAK׳qgxAFO(9YG6$1n3x;Vm|W8OH1dlɎL.Wzp}Ն^̱tr|?OI?)iKOFr!`kB=`MoYIe:w` U;JZN%tl#n+U%F&#\qW$W\pEjuWRWG+ HJd+R "=WF ֮S֪`qHp [ׅF"~aFgeex_˹=$has QMPHrZ ] Q%>(<ƠY %e \)r(&egHgWJ r!PJ2g"\fJ)`QJC3stv~!F¢=ƙ˭a2N0 ̌\ LZ\R bh W(|pEr!\R;+Riu#ĕRkPQّ\29TkyqE*e >F\9ͽW(2 H.׹:HgD̖UoN';V7!{`;@SS:vJL;e'O#'~6'kCxxxN4Oo^Og^RmUDg I/eg'Lfzu>u#0şr*v=jZBuRɔ Ψdkn0(d[lUZWUiѿߏOȿ.@~>\XW/(r&%,\4IZum \=~4|/vqF:` nj>_Lߥ}WUkAv㴮 ˚ڝG.,HƯWopXl餳҇9Cfilh 
B"Yʫᯯ^GxqOT_?菪tׅL`!W1jo\>I6hyԂMDG}}ru~hѧFȍ&Ơ]U|FmX7?N_73_O|@uuDuuݕ*q;zu:_UϏV~Pj@SǠ5ܔF ޙXʊ@# ^y\e\*"K 哂`ވ_ m65Mfy*bzw: ;8a:]/F##f Rt,Vh+.i 2q&ܧb=2 uIoƗlm~|I bOڊ(/_L0/~.ʗg4.pA@)螆}Iu9.׃_~TME㦏?txbs+wk{…>|]R/ճ1`{[̗k b +(I64׎WVitw8 vy.w%[I[lb݉h^nWFxq,ɣm!o~:S&B~[5?nVpė['haiLCROY J.zݸF +=:1@R,s4Y. ÿ4 {]^-kkcZ(/rR5~վ(n߶o;|})_e79EqNS9C;H<9wa߂#H丵5ti46yr&F&NT4 FS87Ff>Aʲ5ZF܊TT8!_ x*JA VdRA"* 茒sRKR9^(f^$*s.9JC'x1y^2 n@}T46|*B8ngT7U}4:=ds1bCw_zrzP?//uQi-7XZg_( -{h,Y8=5 $rYfƜ,8 Z+EJz 6Jū~cWrꚆk[IYxSav0R-ÖcV ,u.' gP1P TG7b3/_:{[\9Eu춤J.} +ơ‹?kpfL[w@Bz~']7,6 9%@рI"zŗQ\q)TSiTĚq qZ. ߍ/͔X JXcDj|+EBC;6VU% 5E 1eDTg|Eb|ToSr (+T6Lz[y;6P|,1;w`P) `SJgZG&h)Ey s)Ey,E)ZI`!{WHl EC%G+)=b\j>qcillΙ.>8\2yH 4'0ƧĹgq7g߈9ܔq$J4O,!5-7PEij r}ݯd05WN`p㔱F9fmdV͹Z d8@ox] cP.HiDf#sߒYzQjFFb,Oa,gRO|Nfa/m4^+ƪjń̂Tc*f\jK g|R >u P;;Kvrmnp-=$^(19>^_HgGӀ$zKnOcge?>뿝4=V&ٹgҌN3Bݎ7ю7f|\NNRjl6SK$NT&5VRXCN1RM T 66E :S-YJw#JK1`P7LAz隇QpP>XI=뮞U5n[N'=KV,=o?pU?g1t{ak |)/ھ~*p0g}-dU<Y~d~kF(F|5Jɑl^ro>.Ҕ w!@CȡI, \pTQy@7?_uuқ|>^:}{SKn !eoawȆVf>T.!O= XԦ[ hl 1'x/-G%߯ŭO2WɈe+ҥ#^7TЛ͹qWu,o\>#^Ż'_dո,jO4/\nz`svtְI׾(zr_Â=bӵC(b{[4BU;/ei%dPK9'biC&'L7 { *}CУ }gOmjKu%%%FO԰> ˡjDͺmR+3$i-;١ENJ1P\inջVsĹ;>df)=]Lծ?rN_9|^UX(t?\CJK]!E1A."퀱 4egr^_v_͚`i}bqНDݦ;o^0߬]wou^L *cq4[kIy/n۔ zE՟l0"nvM`FM`-wo)q}3䐍,F0UMQjÞI| ǎ!䓣Xp, }Bi, $!ǹOCqv0q&ӓ:&7P 2vk}=J  mνS\϶7`N}!b)fej?Q K &} L.B.ЮB1$a) 6oi~pURيO6UL )S۪IQv6zsz l `Ӏ%j3Sk%8 q`ek|ӏ<}Om/(f>?.{M0ف-ɠeeV/C'Hn^u` .u -> m’`DD [c-ݼ^SEL kU΁BfLhbIIQipEg9h<:S R|V Ѡ[l$ V3=W-GF6)'/ipV`8/8_3V-ړ\>.}=5h}a#ΨwvRqAG/ OOϻSQ8:@"N@8)CriT=D!Q|6Hd@;Ks@hMF=EjgB&k)6B"Dp^hNҭ7>T3Oheivcyug{BA}^D-ߙW{y o >Q'V@6 # 4,0艚#5G43jhfS|j`B3RHrִRU\D)B2AJq)32~ɜ~^IMm}ON:aAJ^H f@+B1E!)H M7qsS42X0P` Y$K "#a:0- 9s늜ێG~I;ڤc&=j"#[| Q@C€8¢LGAEUz#q6 fa)U3АA2 hG9AQN_<@t}'9waV w;㏇""i="n@S3j{̽p\s符θ@@Pbs #$$8pF$1@ ",`brHH(]ꌜYWm\dKc\d=.Q7I0Np,btpާoC΢豋`mc)ӤBl3TO;9wv{^S@|Fi ܋k}v\ԉ ~|&G-a{T.Bi؛r\\PE$*)E|"4Z)'y:*aU,%#B 2hGBH8l<y08zXLyS[dNA sG1 \R=3ZH * AY+z'iިǧk^J{Sj0,P"SD'&jk \x@6pN8֑ HitL*"sb;Y3)}vF$wgR pFv47ͺ G `F"S+CMRk"IQPDs{C^y Q\[]aD `8LqJGd6쥔Fv#=iڼG1| 
)a-5@4ܸd;܃Rn}–<"贰ƽO|jb\6̊fؙa8CXK|8Uu֔Şp$igfMEcvNF #<@0. ^ ~PĎ Р2|Jv(~F"r0ARP;0Uꠚ.gH"8lB"tHp;Q2'_ I~$qP?6> }᪘ "E~mO霏g瓕 %f8MQ2Zm7PT+54Ή7=YQincʜE<lGm~`v 1 sn>Fq88=ז.Gx,T 7_abI#1~H!kafYa4+V1]j1|009FQ O] 8 6.=Gѩ"hCx◺~\ OG'o^M''_yw:y'~ 'Ο`fpM?E{?l?6y0^;4Ulr͂=&)3fY7oU]u^xqjFY"?0#Xk4ʏ^.|xLJJ-Ѣǒ M1Y|L?}JU|ΊQ>v8*~:^mBB0_'te^ۤY(>n|1m( T|8 bNrPE(~Я{e~40Z#$PvE\7dI, f+кmH3N;%osZvGvRöIbT3 Vc\VŘcl:vQG[{mI %ȗZrV:<{YW`"XՋUwUw_|rTEQk^z&Jp')N/?N;aK+,K^rFJP-gҒGwEb O+^=IܾLZ'"pHXI,H$Tr*Z+DIr9}&+XUɅ-)o[Ҁ%Sؤ^Ż,xF@yƓ)lqOqӳ0)=a:o$9N'4cd,! &ZM@eIlC=_(*n:U Wh.W,X&l*}(7nJV%z/5 }^n33 5Og0 IN5q&cRf9jӈ} Qꤜ  .(~@Waʐ_/f+j lu|p!s?ݧ' );G-.ւȍUc Gzm:^IDC KhZur;:#z`7fZвM}Fϋwwޤ٢祖!գ]_tbVVg7T0Yo띳f;÷w m62݈M~خ֗2^gj*Ǽ;z۠dJM5.,t8G =*hrl-[+mk"wM{[ U@ao%s GN)'XP"Pk#3ͩE1ҩ0bQ # be}0z)#"bI5+CʘtǖR8ƕU>v5քA86K8N֦xeuh9F A7j1G!>م|J KpYLI,zDxDHܰq =0<0( )*&J$$4"*XSi%T8 yS|Spi/%DpC`Y,-,xGI4 G:ȹFnGzS^zHlewS̕&5\3e>s!V(S;C1K`qBBDE, CQF,&Q{xٙg-BR +)œ`LZTS2O93>FZyA$7L|zۿqȱ^IH=5 a"ˉQ8MCP :h$ldykGC"^BQךM`=K]xM]sW Kϵ0DJ@@(UQɌթ(x12C<;{OHWLkDNy͎7bmхŢ<"Ӭ\{rg_QJu y@L B*8Va%Qh"qJZm{J'j< uXW_vY -Q#% LHb0`u(ߴ']F@-#BH79_<Uaŋ]A}GXKO?x~Xs&^߼5ջK1~գVu'%1l?W8rzh}QR>J EL+Wș=8~_lBd _$Amd$g7wWt5x}S/|Y^{xD TƆc>WY(Ώ'? 
vEO ~369h\񻢶T\L9VJ)v&"AXښ7MǗTz(|*U"W}+㝇D%=\}pE`\%J\]D%=\}p%1ct1 kǬڋʌ+ΌRe\jAԮ̽C{' ~xp[Ƣc\8}qB?Ķ,`dܘXڠ@I٤" ʞLEu5r Cˢz?LS|l){C&L]7^X2tie$XLjA=-˥ؿGmKʶovʰ#zB} ZF @ _P0| F\%rJ*prCaWK ?vW\Zwp%•-FS.U4:=*^/d_z_+?/e1Mqi~KK L<4'T/~yc3P҆4O7nhLa~Q/RW GW׫Q'˅N)yT=@/3eKcO{ȥ?}ۯ_a >d<ڪH(kTXIM O13qE]kor+d{MyH|^cAHь俧ӣwόGղۀ ũfO R ii))[IdX/f2PA9xC{6 ̞]UoŨZ1=J$q϶RO6a?bO<2SN1ux颰Y<234xdc/r,yU$t49"CrZSɞ+.13,t@̢, UTo% 6gI'V(ﵴXQ_Qc=.en/kⷩiNzϘ1S24v;5ONe?Sf]#VDZ5e)kƄR PH8WrVA""eӲޕKJV b#MKHP%\}TkLVif cmll l _MGUE#M6miWp85yotj[lth{RhNf4^h#1bRJ8H<4h1J26ʫ6Tg El*rduvc- olbebNj7[Zmhla3SyhLq.NW)&_+tJyvr0R7X1y#5{Ȍ f/95Z#(~# d\l}&Lw=aLVX[D5["쑑'DY'؊7{i"!dALAaD A'>^<ih%VIv+K3)p,YX`RSzNjA[f췈OubڲL)lkMcg8ŝQXH[ ~IB[-0%#/mMB{H:-v%d,:u8I`$0$A$x,dJҪ},t ŠBDW *m9Q=GQ*S.()t&^|;D [ѴCUMЌ樂~tEPǗݩ8"_i懰:D>ǿncT0!Nc.&OB%+BbP+JV(;vڦ P9$J;~jZ%POJ1edANc3q} AfM 4J?KJb1m!$)* {>fYObƁ 6P~ۚ1-ڮXoE&V@t yp~K$ƒww@/!K6UiIy^ c{ױ<Nζx)#" -@}VF[ҬcұKz`>Umw] ; 8jXpG0hR8|pzFg|).`gui T.Һt!zi-L.Ym7䔄(xa;|ѳbϾJ $g$2Z$SaBYv1VXLжPRЌ $dE%r}V5oѦtSVGV Ҕz@ 4xuW(Hk1/OqUSiv; -n3:zt`b]Ϫ +ϖ${U.(o0C@" ڐ~5H$4h=)H/052dҩh#gRjcn 12QŘTMq(Zڢ d{RruBh]J$:2#mv^xyޟc~KWBAOo3y%CtoU[;u*.pnezWvtSHDs\{w<0K1j1Vz|Rw59}IRSң)̒frCv]]7Rd?zϬ7q~qʛNmWv{|U: ]ݜdxH]|[H] TE ɺ}yyί,)1k3^|Emt]_'b5V+=suw+Iد v6+N8 T"u߲|l'>f}??#. f7. ; O2Evn!k\A|=Vϧղmxy?[lڞ݈⿥{>KllF飢$98Y FqbL;;ҔxlOvF ÀM:vn~iOhMO;~?9d̬.V~*d=7Ń:.@6)QYKI;[u-x6 axU>}H8jY7Ԃ-OIdbAZcmctQs֎2pbQ`Mֻ$]FSF%jum3q(+J9i nX{zvqjrumh $5IbTRDA /ҩ䄱Pըʙ"@lxIMϨ-7 I{m{K2H%8S3?det̥U\pq/cَ8$1񏣹Kqu` <87,yWKXW/,Z .=IG7Yy+.X߳N "b1Gd wnsӏvoJ0TҊ'D$VF7RqJ̎KEEe< >yf0j L܎ƚ\_+q=Xx"SRI٨+BQ@d/1F>Lj^dHkoMuzm? 
eg 3v\ॎ5m0xW)EIRdE L~p}>n8/#-iQzĦ浕pF( F' |`T&QP[Hb1y)G 7!/HXMȲE*Fkʱl M(qιY*1?dӖ(*V8ґ2Z(> /, $2-?-3Xkh&UoED?z͟8KY#ٻ6dWKFKMqu|:}9W+Iɱ8Tϐ)st!P$v$rȮUǸǟSVWt?a>I;Ü : w,f~[Yꄋ#`\YyhqG.;>inGT!2QB9aGTI)Gv: Q<7$rUN(O5+9MIYůK[+z^[nqyrr|uq@\ ?)rM h$~s#qr+}ʜfOdzk ]Thy[l 1^sa G|xY[nnv92G^g%>:}X8z DH ֏t6 k(ifYŜd9iy7w>?~x}WxW*z<Htk|7C/ɖCx󡹆 iYq Sn;3F/Z8?˫Qy893<XG/ Ax( {,&:T"kN4.ymI 3AQ"OPƟU"jy`6ҏv6U^UKϷ.:l#gVBRh3&%r#D <&p4HI#c:Eb:J2*&1^6Ɉm/Mdi;ݝsk9mg vPcMۙoJ.O|܊|u/9WՆHO$с>ipT!PNV؀j-֡E8?}s5%p"|= E ~A߶st69jA4unqww:5=*?O d/ۂZo-jA[#öGZAZ(J5&D Cu5L3ʩcW`R,R!yLKw@wAt]\$(<jJ;,e@D#,d)X\AN.e 빱1$e3M*$F}.Z{$t*kbky,?͝DލsZ7K8/u2yڍ2hdu󍚿;hʀg'i|Op~4|<LEHȍM03zo=׌qg&oJ0y\(Kr 4 &T:Ɉ`:e9,N,hEd9o $͢f= k&YB Zt18+fME}7]ԳN}r۫g џ^^z:&)+(S&/8eԆ CԁcTH* M&q-22 `T}18 'ͷ 5G$Ðx|lB#gPWObd" e }s9ҡ-G% pr!  Jp "W2@KJGƈ(e24 %FI,I9E Zw)P4E6R%܎ebkpfl&ڃy k )(II;N((%!HZKr*9UԺb[!tIzi YJVD<P˰N(ɩ9,MLmHQ9$Yg)ӎIEXfJ¥,qMt1*-s b /,gyu;wV > (cZ@yH.ieho5M$xnk^OgzzuF$*vjw"2qqaBݯ`k[aZ꼷z;Vv3^*I p .9Y='bZMFJzQDqt])x"!2~1MJOLAz)KdmJZLj䄗Q mTan=cM <w8 RdN2NB$ m%΂R\Q#5ţx,XǗ_"8dSCdO&J1:ž'mo @< PUQ;Q+קBr>+(,RQPڈ RhάT sjo;ub!!JB0 BŧϻBmӲA*}P5QQi;^$Bh"4K& 46pMІp:Q:jW%- H1r+ }Ȏ SեѦ"8qФ%$*CWcLP%T-e2M#M۱2HY!$"hOD FsKH231@ ˵+4`x'1HgOʍ͐pwgzܞs+sϣjl_-J JJ_}.:V\ 6&V)`/U '>!NHu I:$*#K8CBo| \YUi4ELnmخP{S]\ۋvTMȋrS ,U'ygbŅ@0hzZ+GRݑ jP{eB4ed춧ծC_ڙu]νZlޮ;2e#vzpV49/kŖ`s$)HLJa Oa"mr1^gա$S1j5]i &̞L<(Y{]<§8ep/UMJ <S3yZ)VBS9L`Ozb{"u[p;!p%(at\`yE3BDN @d*7F+_kﶇϤb˖}-dwl'|a'[&XʃɖUPzej zOL#'3Ɇ/ZHLWΐ㵯V'>iLxBX+W3[IJŸ7H1H2+7 m2V [lKNh.zgڨdK*"'Ҙ "IS'f<Lhg,MFU{?:V4egďxdK&d& \ˀ̐ 6FOs$"$e+?_OAvfSCZqٍZ/bWBokmٓ;(ªM2Q-W4lx[}oH%L!,TV2W.W8՛2)%Xl ڵ~ouFSR8<)kULGBʍ㽒'IRf\d=&Yfr r o\I~Ç~‡Rn#‡nK%+$Xy0*LLQz159 u~8*+PUvܰTݫ砮zȾTxSu"si~friMl1j& 7W(meU Z/zy}Xqr{^MƓet$Omg@@HM@KsPp#k_k|mqnfº]}ݾ12](A38ߩuW/f cͩV* (Hn{aVO E86ߛcӭ?{^yR޳E d-y`˙h褽+"+h`Jg@r"u\R/ERoi]GgׯW3fQ?ČnQ;Mbhi 5TQ_䂕,Y 14`  Q'TFali6_ sՑ`zΉ 6&Pֹ|,W BR$H7Ŧqdb/9n-`4%L,Iƈ4NSc].x C"e8`T޵u,_yܗ?9$9z:6ʄHv{V !cyD)֔ P\^wn;;ѢiMҾ\޽\j4sc}NiP: kg 끒}gR#%IEKAhA7cF4CcW}BR@ёJUF-9+Z# L~uˬ[@@TR|?=hÍ`Q6g,h2 ѡL6(\=Ԏƨr61b5)x\wU}pxĨt # AiuqI%L~?OsU 
1WmQ T,Jv>Bno|BἹQUuP iGP!Ԭz[ ރF%k ku}gtsaM nN';Yn)Ǟ\1c##$6#ftȨ*BPZ:'%X'5/V[ZQW!5*dlzd>/U0rs`J"s VAfmY0H!BQ:K+z o'0L'mTȗ *]`#L9'X EA)٣ԡti~mm߭83 ALcTb7%,dp-M^ [Fhcl eeX čd Xqi ;: nhhJ}/+*Aw V4ozdF)jPѦ"* m {fCAΛ2S_ b7ڛZ(Sќ)E0GRnCB!Ь-K*%d@BYAPS*(HnS]T;YuϪ J*H{m(`#Xo $$ڗ eFjuD^QBY3' Y,kOM2+V*{M-dPgF\' ܠFmłTM%1,4WU)*K; 'Q_r)`xÀmOlUY]n}Q7u`mFL`-$t>gAuP<>o:@C6&*fF9KO'1:#'`]@EE ]ʃJ I"92e(X Lv>xxLPBhb2$kuk+7<o 1t<)¬JrC#QhzBb9¶l^ A; IY7|4"d*KqSе2"1wP:k6@^Z@mDBMuj WH%tepǮ H H j`YgXbU3ce@0 G͠ td+!x݆`m BgVi'J)C2~ȃ 8vGyQwOEa2,Ƣ cA8MBj(crf:PF._tX3,*j$k4Y,p(m@ [SWp-Kf-w;$aS6i` nQ}A{F9KP07\qjh} 6yoL+ϴM{~r[\dg1A&nn8c`3 =k(اΪ4^U*WZsLڌQg5rFCk bL:hݽ[*3P":Y%\aۊLv0X$SSAv =>`yÎWVh]M`E^=_uN M~g!G!:CE7tF-)Ca E5FH[8Q\J#x ;?A-и ᧀFSМ6nmVܢhR;5֬Y u(P5j>t&1jGb+Z{Ok[rл[[bѪxdPi»*f`)o"a2نf@2/6O 4%鰀U2'Yk9k1[.ՂʠtiL  kȎEЬ]+^ClM](I jhqOqےw3B]wԢ`|JO 0H5zBv-n9;j)LUM݂tR6yUKr^~KgD! LY6Zwežbof^\'Yg:=P2, _4pb|@k78 Q6[qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 5G\^Zz6y^r͚׫nu{~ZVP~G$YO7_{};]~}r<[ ht|@Јu/u׎mzXW,ۓ pFǮW~? ݋*RaȪ6h1p\Q ^gӋ />!T?G{LBc! Pb'jpѩHl_>& Ds9地9x_%ߪMW/^y)_fy}zAuպU=\_<B\ .ɸ-Ǹc\tGs,bҟ!Ÿ CW w9>KF=R89TqnAtxݙ^ ]1\cBWԡS]]ymr.9u-<剗 f_So+Dͽ0Z)WʮUKZs=!=JeHQK͟}yԔ bit&\cs/Zw, DVXYY]mV/GT_Ω^C:#vAtŀrFca1wSP%[#]%YR1x)Y ]%КWW BW_ ]a|ð'~V}]z8]]Vs^H-p+R hNW#+xGm_^nM󺕳71WTxxn/PW+78V[xw/n,Ƶo#olJ.Mo[f v~.km}ǺkY=yW'ߴJh~z}N5JSq%MðL =xT·7zi~x/[2q!!چ)rCj֖rSmLl05dhABd@;tQ(uQxFKa+Rװ8#+ AG `bZ ]1Z{t()]!]9MLVye+^NwuGLZM3Upsgh9xfNn#Mh2 +3gpb֝-~(:B $DWL*-3[;4"$]pɩk)0LdURp4Mv=*o]cϽ~pJK .ˈ?R(.pL+}rs|th;tbwWPi +N-nK+Aҡ<툄 ]eJW:rE@Bvϲe{mv bi& ڥ4uҌ2Y}j?0=]}nckcDW 8B.S4xb. ]!] +b pK+F\cCyhNN/CWAo<.Z}5 ## -校zqL-Գ=)9ugUJ@.~ڐ^xCXsZ6Q3>__oO?!k^d3'mQR?J#x-M~;uu?Aisoa h AvM ϵ#E#y)!gFl,ȋyHrjhPt\-UӍ2bN9HysQ bdbrEFD|ip.\71oJa*q\kW|ZN&[gEX`nlM)M.BU,\srbiUީo}5PO2BkpY[|VжFRD[t[1`@b4:nt%[e.5fqdte#:8h3;'p`zpw<K(\l?I30qmJ۶2 /kVfz2PyM_I~uf2IC"vU Ɋ,TNtUPb1OG#L 49t:U]8^'9ҧ뗊 AZxyXVai(2晧gLiR* {)y`HE1Uko  R)4J|ĭXZa,y-tC)BIn=sG,}?ڰ+~`8-a@1}}ۧk)UP =-!ս5r ;Ϣ`uH`Rm) 0CP"$| DTw2(:o;cƌL"f"c =Ayq FRJje +Hx'ԁ:Vȩh1ZG$QeníU{'w5_RrYf1Ri3P)r2=G8. 
.rT%]]͍4?,N`XJCX !8C^m(*?:4g7ks>a9\߶d?ew@*US> hY>Ddk1'H׀ I㎊L_H(E(713UUu;d2M[*\i喋f>("{]m;[<5G8H\`U$@gm`J`c3.cVg;EIu׺y W|^ˤWQ6^ڳ5D$.u$fwթ ĭǶDC K(JMpV=W<\!,dr;5v|JtnݜwσB7ijCea*: cͮ]ݕXBrn6fUٯǬ8A kr:D`j)#x]6n=oe;^n{yV;zi{HeO(-'O'7QKhKTpmjeM{_3@& P:eJM5._G\:I|u\&}.D.l.\<8Tp>Li1dyETKR2N24X)L(vfS4cdeˆ^Fc$,DDNڎhA #(H8H,29Esk\ u6J'HAN>&cl EWnwa~<mԽ&!`7}kѧ XzC->W#XBP Ҙ#K|J#x4:,Z$G$!rGpT Ϡ HyoEL[DTT0UD#u8z* ΉT1`"{&CD)pqS"! wK9 Q Ƒ`ܯ*Ql̨!MU H]p5}^[P~JNUW=0o+bbN;e335$[8:V".$I!&QG3zzUQBR핔DJpB=9D y\7nݭ?GtGqYs|RF!"Yd9q1 i$XPg(IM`(=º -zdv_1pH`>~/ ēV74vѡt{+AsB9JfvS$AHK˸#63<3w?nN!i |+zpsOd2O{@F/\Ydڕ[ļgc="kvTbϊW>f僎2EOKZjagWU':~VINy,) ))8G=ãD7WjSjj4Q#jp9[N؊v7y yw2ޥݵķw*>[rym|c&y@VD02BXgd?&gkٸ`O7כcQ]mϵ^vY%+sӉ 02a6ӵ-Wy6L57rOmONh7s"D@ɧh0)'yܵ&u(Ƌ\/YbI֟,=lT Xn;f1I&nZUpʠak)M e=#vUzK LHb0`#n9J騆 t`|'Iz|l?v4CaU,%#BXWWHRiL|}O?r聴oVja<ճ7 `#x`SEPœQqLaU@|r`̦=|S$&IlB =Q4`#@BRʃp^d8zOl``:9{n[i䗷z 0O] &F9J'TM8,ݺΪ}"`zJ-y)u11QIg!**paY#s`5^ ,kjcy wD@4:&9@af4kƝVp0 AP$@2qk"{ 촥F9#K P1P94?3XSh&%%o>j=3.|vZ=,U%n``GWxz@.-񄮁Ӕ 6J͜o}S`&߇JH2g%.rjV]_`'.|UPcgFESG (@7!ŏQRg% 8 R9?ՆK^ic>YkT ܈4N1PapX~Z8[pdƨr¦v>zT!$UÍ J9 , P9&oPayk D8q%ꈅ+`0/JhDq,T)Ŝpy٫ٻ6r$W| _Ud} /,dLq,Gr29$˲cI[Vuw&LS#t"|5Q v;N 5   F|hZ<-^Y(A| 4 8UUo e9ԨTT)CM,؇* df"P>,4L^ Y&BgMoonekvN&wN} {EL@k.YWvVs\}[6SZx]Bޡc,'bM'^Ʀ?n/8q_mjuOH%\wX&jxa^W S 7+hoxӫ5jyq?܊yˇˣ&./?,*riQ'vOx˟'5Ǖr㤄r!v Pէ!-5;[kl)O%ެ)۫6 F^XZsJ\9F9vnNΚHr7y{[~6#Ǯ]z3e?llyl=Ssiq\;7l?[kZNMoYSǙhjbڄ|={H.'kʝ@FE8YtL R)外;(|1O*TԂvևRH-ѪV QuUosF`1 'fE;" vAg;5B%"Q/ƂW0DbQ lII4˫0P΢ђҢEKNI &_lk/l|y0dlRbN!H.+r@/_lx*E x8G?؂_b4Pp}IOzrtrv/S)o|n$ZeATLqh/6Yya=geBP{RtJƦڭ<Fܿޔҍ[(8יs̽p-1d1L.Kc LUH#l>֚tF5A9mrxUnp֠Fl$\ٚAG݆8umfoP7`:v0e=yztQ=M硵_Xh)9˺x]өNU B!?φ?܁&S(u@CT"'z$ z|BMivdwhK/7h+Ͻc&LqonǼv$vX{qqڗ|JPhl-(# \IƱg .&R`cM%;9>= WkCH WmfM. 
Ny( п\l;Z\ryz/ Nw 20mfj:0$P3I"}2l:ty$}9~X{!7y/%X_:ft)dElV@4:'[8~Wr^[v~Kn>׿+w N˧2kAޕمׂorY8]`{gg7}q 1Klc,5t&1.*UZU`}= OP^Ae=׷:|^1.P~퐬jeH~* =*>U+;FMjrbq9G6TIHPM+Skt"FBƩ K|d3́5D+.Da:4wOWBd!7%‘ D:5T:X0f(!RX-X TT_ݠovofE0 tBߵ3D˾N UMig.اs6Mq@pSADJk4; hcws2gFU<{f{^>0)7Cɟb}j)lE3o~cՏ"FGKCmG垜_ ܟ̽|"pxo6sH1AUU0%|§J>S'ɒOJWրHA{MJCP4 f9ic[:uV̪:R4V(I=Wɗ!AHĹBc"6{Iw^<^`9ðU .^\y{31/r7^,;p zBh修 4QaV :R.5]FײkDEdcQ ,@J>UBC,(Ř<DŘRr)utMw:ZA+F/{=I[N{X%~;[ WEX B[LN66bJ :TɹX|-XUǾ8hWO1` ZcQbA NDSlDq6QUVJw~ g}v/fc/־vy[طmvN"wY>DMӱ:b^x:ɦw9{F@3ōPF=v>i؉P~ҍnºJ4B"MRB5C'̙#lp[]V Aj¾TtxAHkmSfA զUmE6)\6B3& .{LG5_xrc&Q\jv. IF@nfS&`2d@A$ IeM9CᾆZ@ȩD~LjfNr5Hb$dރr"R;j7q0b>$,02DDH;b|~ϕ4zݽ):y4;$H^N7|y{ v>ROz3jeTCNAH&X^ :*TTڒ54KtWCTs)Պ(FIP!&"mډsgUnq/ԝ}|Q}Vvm2޳c}pN˒7& C<=vz#Mc31&aR*3 !Ei>HZE&ɩ!GTylҹp:7o+mhdO|D24iQ䄄ɄL"U>T@ʥ*&߻έ8{4?;s4#g{iǡ^tf DortR(QTjUת6z!j8l4t!ENi,PhM1Q(y!,)R](^&pE/7a68M?GG?=/;vDw 5INX4CX?{F !!9cˀvA } }u,S( SMeE 4 -qdtWUE*lfϼ^|61 Q쒉fJB)SH{Ϥ,${'ԡAE1Yӡm9A >h- ;1;5}2{2jp4%gc IX+>Q-ZTFR&a>dNBV!+v)_BUMuƵ\B OZ+Do\꫼bjL2 0ޫ@Òrg/nucuF_(bv2Uu2KS.IjMeԱEI|yڋN64f߄[g|"w jk A3SZe4ZHi]c!S# "BUǩC|{Y{8.`;],>tcqVؘisYPpUCQ9,$(\63;Vj FgRb٨\t zuR 46F~JuS>ۜ UO-C\L>ܗ8_zM~;JˍsZ=]Q̷Bˇ>PH枆̹Ek;u0dPȜH:kRZ $si: sf<s?sդE5tsդtz4W/\1)d@J&g\5qYin\oFsl̕r+E>zĥ'&vuz̕}u5y̕lpUá+&\@sepS}<9Vi4dz?Vg*)Le!ȯ< ~Mư|Nf'iLHJz- t12y\Vˡ,іQǚ5T6&0H7&q#y /O=_kXw,_[e:垿NfOE RiA>ӧ>>g:U* X;v],l c&I~$oVL\:)uQu ,%H1FAY)|p"&kuoLRyCS-Yb,::[I!"j̙bG#`2QaNCFMZPCFMJ2#0z 2W"0Z<s%u(IiJ$FsWS0d1X"䧵Xy& & _}ߔ7ן:85<ɹ~LzJ@+M`IzC:eXi?+ݤuMJAKd,5F;={ۦsFv'b֖\ |r;߽["l-S±y%_wzEBɿ~^h,ٛ=~K=#1",_hAkfs r%B MZ<E !m>ju#D07y+7,lvH ~s]p]>w2Jr_$u2?v䬧2 ld5wtm;3א+ + NUqUM|{]tbZ#oJb!s Jq눔1 Ʉ<[Ҁb `RQHNٱ90oYQR 97qM}vEnv^;$.%7urSv{{Bt)_sɺ&s 6W0Q1iT^PmMc܋nךWq_ӻu`X9^;atn۶{bTF@]*$:FYDT`#U>\ 5jdTf '9k5G<8*VTjq,3E Dމ2W#bmX|daw8SPAlbtpo ^$rRE \j##|p_AmS|Pt(%0ZU(h'2:Lz]X\~4w9$ vϸճZp(z4`e}&x$o@JJeQG7*Ч |~.aOmSk,h.)m[XoeQx{. %ǵPP+]z07q Z)QY,Y1Ps-dƽT)A}iiShRkG,#ٜH_Y|g\.o=MNGҖ-]-{s3~p <]8^Pٕ.䪳ǐ]PE5gY+ R,"j#) UE"sMUP* ݩ&F'GNjzus? 
e .\\2ʪj$` T'y9^on0XT/0ϫTRpg&l@T _p1(EY:>-+z~QO?S3ŋ򿆓e,':Өa[/T r ,c\[sǚwI=$T|%H!-|>YnblvKg˯Z!@OSCYMc)2RJe S s~deRiAh~㕵յ%5>}"x|R {:6GSR3} tde9U6,IL*&?'$W.'\ N9]Mj䏃$wV@΃ˎ3X6Y"aZ/jeBzjw7 u}{Z"%Ǒ,KZRtTmʵKM_*?l5*KMft+mXwCο/2/s} U]_ťq<)E)WȓY=QV;tLtSUY3(;BQk3ER=f{mi!O#פ}< ֨Z ȳye6X,wfTdl+Aű yHI% ȯRۦRU\3)0ga6fQ|/TE_&sa9Z|OVc7 +q`lSZoz:Je\s3We`ʧ>B0H8e\`9܅l\LA=fl r(BFp{u\wGجE63.3b-e,B>Yӊٍ>[dk|(ɛ繁#3s3.B >3bWŷv n~"Bu贺;(:qQ{c%RwJ$\VxX5ߙ Lap2j%)~D("1| g^?|3=ioGЗ"CÀ> leؙq'3_&9 }_5)Cfu7[ĉf]=[jNx;5j%?\">AN-붌Aj{ /|͹;|v,#R23w;k4C&P1*9e?/BΏ.|,=sPNSowcW4YV`O3@ͯhCmuTJ5s{a5MZr.AFX t޹?!;ר ;Yu8c#aZf*!sƗ hNm3XsWzxt`e&mv(Z=uYv:~.@ \ڭa  D븒ȟsAيPH)ztIeg5GP;>ͅN#k~)nu'ጦShr)#ԾY3H,","K4ߋ~~h^C'wjI}1"B橱Jfpy/I XXA аot1mDBW:a<ޮn!j`7@ TM":SG)U&P ENS=S8ȌQ1WQ@&83B FE5Hq%5A0ЉFKL"uE UR^.Jc,FCf\rJD@k g!zP/Z+pڕ,cι@Bw`$Z*׆rPNE@oXW&_ucLa <: 8PL ; 8 Z$ Zh:ܮ0qV#7RyQq[>gOǰ{!8EPNzjys3zG;(Tkw໋\4}C[T,m'XPFSۈF(D$IRQ VBv!.b`Q`b&,JZ fT@d,rVnzx7t5R(M}oI>[/EO  Tr/V= Rp%_wUKҦ~XD| DZhoOFq2͗;UJc#8J5N?VnVOffzv=gWP5u%ci2 kz9+g6ry5!B !e]yv iRĎ?á;|P0NÖ<.}$YO%+?|'l&q^_'#3&5-^{s|VxbG"Y rSzSjkQNwoLS%Y0_͑tԻVTx Q_gR_*%_뻫!؞Y9|8Z Lm]-6.j_>"} {O VS\NJ!b ݕn+L\)/KTc*P,!ZP M>:,r!ԓ5,2KK6ZV=S>?w$>Gȇ͓deZZG7Y6#w}#n.=H{(;}>yu^/mȾw]|nz>g3VcD;sZrr/UļjW$r@*AvzSo<۔$|֢kN$zGx]k>}tG]!Rq}f)OUKw]t<-mtJS L"z4ƛWݜ2Pni&呃S„`/, VR ڮٻ=w L)"#^n^)?圙gI«n>ϫO]OR11l[|_&V)\@gS9F{΃S4;'+~~Qlԧ<Z=W~4IJZiƋ^@{P1A:. 
7fQ,ϸw?~fM~H(\lE,ɫM]'|N5"RȔޑN).*bHma -%CZUZQ=VVͣj%y*ZaL-lQz0lSj=#`IZ#yv˩/Zc౅9rUcc81xѭKsr Ҧǂ aYṕ9oŲJۈP.8uUh1cO"dJ;`<3Ng`)c6MSMn)Oua͖9o/A fe[mD+ k6|7㠥z2'# g_M2 9E(5ƊMx{G.cU u6۝qF8 t#Ƚ7VWkܘpFMԨar.0,ZWmCyM-S.&:Y&LT9*m4d̤4ucbI+NϹ&d9o+dAQ#1&hNɧ嫄,Fx.SmY,b;_2[W?=Q':%z0kS2vxU%hj7WЈ'3$y̔juJ|pNs|*ir( c}UAH1lHImߠQ `~ql I: yIi']tve}'-pGSa)kQ g: ĴM"F@  ޠ?ތ.R7W1/z3ZguNzO/~td`jje%stJEQ"J VFjL^YA5[}6 A#k Q L(ܽ%̢XwLbjQ@'KhҽXDFkE42k;ą `07"(CQ uZZnttC ii7*y7x*HY۷Y&E`ps̪E%gyt6 ,4{-R qsI"2cTP-jN?w5Zo18Mȓʎ.ŷU~,_rmӞʼn;u~/3<13ASِxڙ k}KrO[(E1D t>-^s$\tUsERXjpS;??ze/fj(QU Dg_q&74.QdS 쨁v> IQ;W9KD,))7Ko#38soVH",DGL2 $/?[L?'!Ɯ/dh/}v0 F6(=EZSj h3JzΥB Έ2rM]\Ov6~Ekįkt H XWi]s{u]v+=:!#Nr T{11 h b^7\5+g5Z`x |8E;Kӗ$Q_0\}FP4+BokTď}gH%zX2q&njy?R AoBsA4/h^&ѼLyYu mRL*Rgb̕+C" ax*5"7Iu](}R}z 5Ez‚J F1xΔ5Az#-Qn${PPu8˽ _ А;N9Cn,hEdvgd7未F!4 bT!461*fAP R;.yI &rBF8oi7)sbE}[E2~y؇Vxj;, Z!X.:FXEA~ay:jq {\.}\[eEio0H#6KY>JA$B^Q]w jP}&|7O"bȸF? K|w} nS?շG7POUFZ1t26K=rT*9Jח;oݒR m%n)᛫i$=DE>%i)%Z@Md8e(,2AIQ,̀&6@[|WS Jkv! #¤1AQp[&c, aC$yՆ602$?HՖ<S%cYttA#vpRh}#iҷuJZBA|Y" {Q4u+})yĤj)% 3hRJCG1ccAw${)M<2G_.uǀ񥕹Z*^gIZ*35OZS0l@-iZQvoș{N}.mG֥/EX!}TZߏǐcT:6mȼՉq,|y.P`ell(9㤒* `fMc&\`4X[1qo%ȃG&@FbB"ZwD*Cs&o ܖkꕳgLvkOT?ƎǒOd:NVx'  L?&NJBئ`NL1QJQv!X"ݮ?^'UCusiV.fk|Qd> `MqG]M j Li WDXr9RlvKɴRՆ4Ս ۪{*)EdOxf+JN 9#;bn77Yƃ^k7]ulikog>ZI"ͮX9yٱ䶍m6xV*я!L)XiPp$HWCOb.$8 MAfU7OUQ<*v9wsI_@S>56/Ր+p2Vwk/4W̷+$GR*(BcO!Ǜ\v)4idZ|-.b;qJ &Eqo<o~V CGN\|{ #72>@N|ryzp{#?(f Otr+>?ϗ+P>ˤQ=ܔt}q:-h沼C.B6\aֱܿe9*}MpI3.=XT͟Pieon?[{ߊl _0޷:yQ@X{-֚ Iu4]M^1;o)z?ݐHnȹFǒ!2vѽT|"}Z&ϑh7QJQTlŤCS(IaHLrP 8X/**AS4R00u|7{'<ܭ_BQږOw FLi"YM(Ն3p%gM ddFw](?kctf2)3U|fNt~Mz)hp]kwDn'3[7ݠ[ ewD-Nlޢn;&_70݇MuJ9{s  Wְ ?+Ru)vrzGדް Ģ5 '}+8ozEE_jGf%c K:uYM1u lW H'_x(ٯ5jw rAK DB+L ӑ[9'A z.r#B`-E8"m& fQ:eQbk>~ԠDu:VyڮҎowFq1mS1{ ǛŃKIH.DG Vȥn8LK+/1H1{Yn)AtEglFna V4QR~~'lf;dmkK_ !rW꺕+F{^NO 4SQΥd%*Ήn?~׿gf2^ҧMt\,"Mo0/>텣C0=m[bӉ>6ann<&>]7Z+,::Nbp_4J[NZpG, x ?{K|+˄qBѳS4gA"}SUT$AJ_},!~?,=g0v\ /yV kiA -fI 8s6 |U0ԇ"f__-۴wRdss{J6f+AM<z?)<5q"[cpwBpŀ'nk 7yL &1xamEP| G"ػS.Ji9Olw$;I5`?g\>XܼȜYi\_dg4I@GY\bQ $W 
GѮY_-_gJ:uF2Eqϭ௫QWtq$ B&VRNrorrG AAusDTJp2BK!:jgrD"X8#R4@u?&]#YJi -w[xOQZXP)1(gRbUdR:|WؒLl JӝmO!Q^4.[mL9B2|wKp}W_sն6+ϣfe]\[N 4cSE@5-y-mW Nן7!V>~UȮ?\=8褐)ieMMTwCӽ!ڃ1 3^; =D n=`y)w"- 0ao,S搬p5.ƥȔv5.&3c!9̝j_fŋk} T Ol8Viϧk8uj\0D@G ,Z$]ϥ,ӍkpWQ{K(@ GN)x`D2q8"y"XF`q_ZoG2k "(Kb+°3HV[nq6T#Q*2*IĬg@B x.qw7-;] L19\e0Jpz 3>F. W7<z i01V@CVV>hxЀVA 2A PW1BBl{T mDϋe~8J{MQ~ʟ^)`1<xs˰z|hJ{cC[W.%Ʒ"f*ue[oRD.dVR>Z߾mEFR$y:kX)<zZ0%O,R(izYFS[!8glvxUץ D]Z yZ֩@,.A JOǏ%],g|v )6bO^r~%/AV_lp}`rsػ涍,WP|JjBWavRSSq2/)WhX\ˤd{&) ;(/߹vsԩg\,+`N|r̎fNfי(]YcT//yae5k{U,4j|ס MiK2gK1>)T)ݔ`*s%v?VziSΦYV7fu]lXh-:$9rebH; iЈRۓf鴜ܥŧ*k!V;pu_P_Em059[WlצQ؃ѻ *G0r9Ok3yoNCLs ;q3t/˖OQ ]d2KUj5\lG4uқZpCwor76Ҫ({cwbXm?8DT㰝il^W8-ZO}k6dM#L .z{j.-9.Lh gq;vA0<Ѝ߽Hw5~Vn͵)|{5N+s6񜷵;$|B?oG`vLu2FV $Kݧ'-([]mrqQ}||s^NCyq9s;upZ:aXHtHu/({yVI'r8.#C6޵gԳEBK٭+ 6Σ]8khDZFC`Ymhhۙې98j8j/?a5$<| ;}˺4Դ|{6of&-權LO7NΧQI3)ym2ݤwG aLC)fUC&y2ekį+f9v@sEy UecwdC^[k1S2³.!R&(J1I$R -Q܈i#'6K_"^hU]/ a\7/&q `̣%FkDl-B,lHExH` M!":0-E/VYtmlV,`i|el^/+M埂mޤ.떻b-&ﳜN?LrNdɯ>˛\u7wIt$A k enfv北Ȏď&Ťm[-g(N􅹢7Bx`Igֱ~ſa#*]7A0{'{wڮ t -OW4x7B94 6abjpʀӒ@E! 
9DDZ8aH"5&Bd}wȅ;uDv1d?o?.j?cyU~9w#'Oʝk1ΰ\0\{uVr*>̫O[ιr r6]s[JdC' E!4R*&a"A| y#bJ=E{o D[{Z85s{V6y];)H<߮M=}w&{3흵8L4%$ "4e X ,L hE$ GIDsږF /}bb b%ƃ%%_,q; /v/\/b~ý//Vý/ /ý/V /v/,TJrIW M3!8"2ȪDbĕ0_IOQ/c~ͣPly`bb`bG4_ ԣKg7Vvcp"3B ]Ñ`n-2{,"`Mi\-ԟ;5*#}(jc#[_z\'"v֪Z(YnޮsUSpMJb_5C-1d,r[OujM#Ld*Aֈ0!%=w"i*2pOEEnݲ e>SE3U:XUL{Lh{m#}SvkP٠Pk(ւS]N,}yXV&Nkh(uJ74.Jc H wiƙD{NQ̦-̶)sO];*&} 7`q35$gڍ '`;#nz]o))<(Dg5U36~ә+oisyhU155%iX66Iuv>SO$:ABT`w`+q]g=vif~&:{/btlwnYݰm2 Ƈ]ä,-?p$ԙ̲@3vg %(yr~`_n8Ոg{fPNʷR*&;yYq[̮J!(l[Rj*$TYm3]jjĥғ\qA*xByu_1wN9O6!2B솧AUzˤ㜥~wqjE +Y~u"s[ubpP+x9]g2FpEJwU+t_.jԬ$־/X_uSkW˯QO]b?_S}0}&:6KYI璂H1ftI MƘ<'B2cO^$~L.B8c)5Q׏)G'٘_] Y13kӤOkU:cNt=dc'smŘ}Xܬ8FujN@:ƲeE6.ciQ qpyY܁M}j>|Y!BTLQ ^ 'B9Nd$fbǨq1u|>< 6.s:r^:5 :5Ў]9'm̘;[6[ 5AߏGA׸j@{1-Otv|'Y5%[z-jN߾OEAdozՔ;WL ,yIVA\| ,Pn+A7Un&(b|ۋmp]?g͆P|-n`HdnAba\9+6"<^U6[iuKK;q-6 >ʾd#^Hld۟MRW ExJηO>n 9enb9nbDn-*O-`CFU'/EXws[4=Dd׭ݼ7ZTU!gQ/r HűQ*aUKL"*˜a!NhR\& Ήn9Ŵ|1(:s!'JݜѢڭ 8rHHG*điQ6ad9:!*q!e8+y>*I276n;/EXws[4!Misn-ݪO9ZP3Zna9nh*Tc{E[2p){WZDIpW3#SE3ˌEdwsLEYsf{EL2p);yP :D{bPC,'n@Skw}7Z؉C΢^<|Jv";9ڭ,R7ݼ7ZTU!gQ/*ɯ jP P_08X"Dc2zb]MF?\5hL\+dWcZ$h;zj$"DŽ $ rL*8/yN_{ߣ5֝_DX1dP$"u>p c7P$H{WOb1<Ƙ+l1,1#d11J%qջRs ]YXcRI HŘ%11\$P&{cD =Ƙs (]YR$cT(w1fIi'1!Ę%e/L 1j%Abҿ3ccJI`lS6č1<ƘQ1fi|y1W+ Ř9j0h1"C1scV(%1sѠ$1AĘ 1f̶c̃1sTb̂b<Ƙs (1& bR1fXccJI1fI|1",1K`1<rnX"Xy.]~v]"xktw`n[9ܹE~>ayћ6x_On٫ ]]wrekD9QݵbwհV]p cAބqA,yˏ.tVYXp:hrc¡q1"0(abdKHR %L*RhEB>&Wbzy +%`='RTVӿ?M^H郟p>Sf.tǀUd#E<X$ϑ#_I pUo:,WdפW{/3q㗻0L٧oˏ< +Hdp> 9#S a̢] yOksTl_]`SoR -#+4VV'X"kĂeQSI"dau"A%?Y+HFpT H& #(њ%Q(E!NbXZ DH$S"Cf@$b"JLlQdE$E s3FC@~1E\Y>$a" 90 G0( \wR_w#"C5-mX^_g̓A<[pvADL5x/ F` KU,Lv[p{7O/Zw&YnIL 7Z0RIm m2Z$Xv?  XGD)҈ bJ)Q8bc؄D_Bѓ/r߫ҚvZM6jW;sEۍU7BwoLjDm~; y:&-e~#'7҈Fo~*%@(|𩦈L׃?~9#rn׻8]G4NJoEg췻rj;r7.Iw|1ZqĄHhU}ݟEՋ 03b{b=Z8q"ϞnWl5iLa.B:Lk&,\uFпy$/8>|9]$I@*Xme3+g;_b? 
!o"b Rf3|now06hC/&pR24 p||?"S܀i5uwbqofँA2-mKoҫ/6^KL %&"lit>6J_HVY"g `i_z[x1|{ fCycyd(&$$ZK% \}$b䨎&3OYovO̤ xЂgz8_!2!` K,kņDM_f%K$՝ŬK|G5+"####"Q+-_JOT (ئ ͹0oI#H?Z˻˸H6/L?CT.PCa-D`٨]RɔIc#'+%V=Ҋw-pfjmzrvN+FrB^nXS,-<3y [0lGc%6ܐ=(V𯗲-.2*LSpg[9nM!ӌTo#&CQ!#@t'%` 86jR[0)%S;&ci)l_1Mff˦RHh( 81m6&^.cI!lQ1I204kqf 2aO2~!S)C; 4L+u}+:JA!J4؎ NG ZSGpi(>b̽A`[#a{ Bs=z1 JbT #u^<6=7E{9fﰸ:K1yƏ*;6S "3Du1ȈH*eDSJJ;]Zw8vv1ZσV [X Z:+e; r9Sր+G7iPMͼ lzn 8*qxi ZJhA(v6psrr bū?I?_(_Гr)Lz۳l1 )_?_ HTNgSg9S>%er|"&EJD:vɩRmu!aM{|V1b,oA-mU;>bbfy0BXGl:`//?^Lo@%Hfwt܊;cF6sZw?/.]tb?d =󿘳x-K蚭6,A?;q˨42_~wdXJEtʯW#ƽĹJgV fA5"Rax1A]C-w3@䈣tC)w:h@̀|ή~#Cg`~)Po`QDp'ت[ ATIP* >U!"1V Էb*|DPS,RnmH BXpoVGEPGĠhĜR=pLp Wp؜JGvry㉓\TuJ!m괴 1R"F?^Z }H IE8Dssej0w`s}X[~f'5=z[Ra %gS4/O_2-Kg!?݂P!2QaRr_In,ScCYy2ct3mSGn2Z-S;J,#v;T,D&sAʐj/?>2)ɓ)E";bdwGij+TeVWس,~_Up}3X0Ov/ ɾ_0EJMy3$MM4IHJJ"o@ 4I,)*1ZkY^,&p}/d1 !Q@PJcR,*(KnjE$!X1 ҈a;iqAI/GM p-3x8c:J^L^,fOf׿&&Ivճ)RK[R'.ݰC !fIJ*Vs)#1@Řo 4JW6lNIo@`ȢҕWDK=PR:nS1D#GAz:VLIJ轰e9_^ +k4/o|gW9Ƃw(ٶ%p8BP9g. P;eu/~%C\mKCD=kxvמR/G=VOJOft5ȬNZm?_˷ooRIֻ%vfzaWI?Dit!]&`Bz»"_[b è|LyŔ`- 9 \#(FOcL;]N{ aĢ"P">MeUD"~ZR0f*XB]r\FP_ݜOS68뢟"}hzĿST+I,RJ V7ggjV;:ٰiWm@Z?duqHxeE=EVhWYpSO YfIJҳݷ$gq}'wWtiYLo?]K%T qCL8Ei#%W+V|(ߗ%djMɐ@DI۷+ם5ז%F/Aź]B2%Tcf},W%3IBgJE|>nBR<qJM;#ϓ{і{x=zpz?Drx. 
_yE` [ *6Xk杏~w͜!fVW &wsO̮}(z`fwDiJ:T:0a!-2\ꕃ=DbV,r۵ɏ_Q'j jcggŧ;I,R"i!h,yFA>sAչuѸ⃌B@E,CbLI7֎ zKM''2GTۏW I&a(i:i2/}U,ŪKx>~IfY`N_n-ÿe~ T7s"c.\3OPU+Y&:dJB$ӳ_l~Z:9qJ?9YNxw;#|27*isF@:r:]ΝNmZz2/ղD+ UvhK;rV᦮,7Jj&^6>׀zL`%`3d)q)ySߘ$FT LHk@ǎ;?+`_eYR"ˮI'Yc옌}v-ӒyEx+Sd( c$ Mt.G4~n.>||mwHgj#uwQ] Z v8[y-Xo-NE Vsqˆa>C כgC?{slOO~ZWƮT蚆_XܭݰaZ "zk8S2+Չ \br7{f;cFW]FBdo!ІRD;w1˒A= ^\-6CQ&*r 7)a%Rm=}tV bΈT'˗!Qu%+GzF=ٴq`yy'Ybhɝ >JY= KZϼ!':@-1o_וݨyoUZ'>0J^vH/"h[ FaVdrG2_)^UXZ zMEk<|zw9iZl?kgִo~~r=?ޮB#wgq=g|^OgFfY]= ׻ˋM߾zw(l,k_..nϿ֜d!']iN#lvŏwWWTVSsgδdӚa E}0k-ݿ6RJ:M#d=Yu/`ԌeeNzFI6JW +Jhs!,B6 X2H#SA hA`;b=MZGh](@8(_vXp~~4 ~l el+Y6hmNT&[X+*69E{=M#ց~`;Cnf((Y}0-gٗOL0_BZ_!-o~b~ ֆ~` #ʴFMm3;/>U4Pd+V&!/$w$C H 5PпֶdSg!75F mlCeh{]s{He&O.!]B[2v\ I+ޒx+w9 ]VǝE"@tqk$0[nm^I>Ng'j:eu{~,&ԯ$`ʏyg]< 41\nu~}s՗> t5:p~EϦI޹LW1A~?nG9.Lnbeqt] ~9t7=i:F;nR2[LWeQҧ^PZ`@~WCX>.MOT]V)ȸfjtb n6 v(?ٴg*>]N apn 9 -=}$ZBڊNcAN=e9 ~l7EVҾUh+?. fpQw TWE헝 o@na-RbY'}_i!?r^۷lfi(20ř\fF.|h֎9JPX`8 CLzYh !1 :Xg[77Dtq.>̪MY׎~ZҝIZ%HH,{# is:N\gs;B*c.jVyۙK,TIf|Y.80RnOCzՏCN:7$FQD*IYrIH z`e*sVT%RH]Һ X0-{z lQM^U יIguBj0kVZP*"Ns(X븫ҪknUZ Z#2[5| Z m梯Rn9yUXoS劕97]oe W&ΏK\4z~:7FfzP<'bb0P5)w+(|مF sa6#FVC\+GLFo|u[*Ó~2`{h昄T:96Xhgk% ݞ>NA- 0#U.ɇ ,bwծ y`,G3pu %:m3Y8{c=hkk9< Weѿ129 0ge~E#^2L)Ŭqw'5 pG+gb $JA e6h&q+]*8z!veaؖe:P"ˮl Ka:jJPQ \H@_?m(srNUP;=o;@v؟I DZ,YY_W2zc:t^[.K/&+qRP fң(Ʉ$kҟxқ{PJ`J/G.2gQ ww%,9p+k!0cS DpYAVE$~xFRR]EQuG9>p0weGiO _|j6_"طwŠOYpod1aLp/}dxI0 A"y+^G\π`W8JΡ΢P ε6.'5,jtU1FP轀M'GU K!dh=5HǣCSZ\W`sѓiln6HՐ|`sp- +04-}ς-χ1[lFH춦 CBBhk@hڔV[oS˟ˇ\ړq#P%He)+HȰ*(=.g sBmVz6=@/ޫo5'/ο]d*{=TV1 ..\0D6R=k2RYśMZOі[tu&,jע[ 4haZ2ϚÅTZYזY{('S` @#p= N{O.@pP@drI9_ve~ B#e*J @ȶMr< UH|$HB3<II%/5=KȽُOS8Gv"Aowgyw7xwtzj{gog5^iD:}ɟ{~@чM?d-_| QEg  !d߫ g-)'mnF!٤D *mqcd;y[ `dn(!)ي5fx IfxXf}ᚽsHuLmߗ#%+9$Jѐ`it-%sA5x \:"E^h,f_T=+e1!-^Kh:Y4BHIdKbWMږ:r:D(7hOnRbn^ދJ֩y,V$#=W,%Y%SO fHi;UƑA,BA+{vOl!U#B#FKO4N)AFfttCfFsvrCC59TW\˼?O]xfg\10mNŴSՈ8nCr)hN!S .g-CHj \Q 9}JT)Dg2"M+f XRͤ^^ZkŧA 00P]S۲`R3%+5XحEm\օAp @CdV͊,"ݏ+=^݇ 
UKR5&uRK03`]+5$ኅ#|oR,rj.ѿje4ȎH[]_;x1vdKN~u'$h.N"lƢX.N<=.DnO_$VzʎOOA{;b;h+8#ߨ5JN%xݓ!ݔ9&^eh&}5b]Ĝ/r,Rk#56I&"sd d3Ʉc-}L(PfY"CuF"t9*[tm֦ $a$eYBL"1hBa\bsW( D=Q^5/YFÑXaEk^ox9d91uA 8=404ejxkI}=awAIc;Goi'a (tah=8R)E fD>T5BSo(М[֪wCXA,{ >Ax !wDr\ُk_٨ Xi]u2\J}Kv.e5j8w# OH>ȷ{vxijџۖ p n#A83$ (yw6M{nM ")pm!a s^=b/ sm~ l߶h͚n4Npm-u\"4'UqWQXtbkN̞7Yy$reڿVo/\:&d-J̳2 U My_N2 _Z˼8RK{TM!{*BS5"y/@>G&_iLӻ)Rr "dn5@E*pMXUՖ{~=/rvtp?:sAߎ^89;I>|٢ jh5ŏWK}x-e\k^4۵|&f18y'o5PHq,o>=7`ۛf?p wi^Y6] nf _o3PbIuHME|8vu1OJ:]s|=+Ȉk?)܄Ǔe+0k氟Ɍm>ԺDz0*"UͮzHN.iV]6Hm;}8V+"W|8iA&^]@㇙ /L$Vse %͛DN{kD.a?~l_}9f'}lX2̯:A# K;7Ϲ|uܷϲu?&G:9жsمIy>D{Ի׾E̜}>{i ?hçb99Cͣg&+,~ mgQ-l3/dɭa^}>g+)K sac #gs0Kؠ H`ٞ<ùa僽_n tq?#ژ2;#…,Iy7+0?\UFTZֈ1ƟIćNuҬ}Ǐ g6L8P`ʸ _ɅxDт2#Os$cR6PE`ۀ%N*Ot+U9 Rƚds >Eȣ\G%rDW (1.I|p./Rj͵3ճ9s0DE u)S<:s?QG -SBx3>a I :Yx͛d+tX[ ɔ a+aт9#S:,![cj 9qCqS J(c {L'O:in Lu;ҳ׬u@f*Z8ҔĄSHi.Q*@%U\a wztDB7XXL 7F)t;V+0ңS i6n@27V3e~iͫ-Sy-<(9B3  e@e0J$֢oyC9,1b ?ܧzj(O^QpA ϒT[/X%"is P;+hi w>1 q,U "08A) 1H lٔ0&e"6}u(%Pw\ʂx(5Z=&a|p70vU.Q#-)i\]ycitOn"%jV#|HJ%yjg3v,IBd̼ x/JV JW@rp2;Fb$%>Z:?cG 1J5h)kVKmB-ᛝ!\S,PMnՎ[,P-pN'm=rř;;u#XQNM/m3s=ˡՋ>cS%i_%Z:DQ1vd"{1lud_IgDN"5hɪtwvmT\W}로(gru.)I0YN3njESw1oB޵#"e0JaV,c/ݙbw0ijHrzzߗdf\$;::d}_,YUM9Qsb4:Sܸ+bzOgשL2_/ =Ӧ)1iTV?ՈK6Z("ms2I2>inw~0N?WʶU~zil#ATȽ7ܩd8Ji ( iG6t9ZI&Ӕ1KC5,vfsE:np} NwF\I'qr6=ڰ( xlg]QWd?ϯGf_WJ&Fb/[.2{r;֣ JIQt"Zr 5*ǃIP#v1u%Rh+Yw#5ۄ\E=iCUUE +gBod©IRU+DI\9F\$Y#XP%dAmUq-]p5NX%5Pړ(^1fQp y;jd[׵"tkw+ESRtm]X@ 絳6#h~[ѭ׍#B+^AGuk5"Ω]{CJXmB PjVπip18[9YPH#NUچPFh.5އ+ZHp h;Ei~kФkup~phΚ p6;rN0W!2c4]o$=m;UvcX4򬯤Lťzʢh{_=3ш :=&H3rt<6]?Gdh58V=?no7~hw2{~joғmHR(E$u{9S>-Ɉ|Oc%w/k5 e3vj qy/O۴KS>fc;~?-158=ڌ}.M=[wMć$O dx s"irR è`  ڨC8i!"ibwd,oX8(_+ {k"!4E̗ 5_Q8<$n+)"٫F&P:A:BGQr0GZ 5n`:jNFWƢp4"0R9\Ӌ5˧pw7:q4cp:aW5!"mƳ(;Q 5 C0 iQn[KJnkcjs3) ܿ Lx>!8$B4 ifJj "rpRXL_[hTMe盡 *qZt^PAEy<|fBDq\f o0Al( e 4R3ۯ3ie'fbwO&j]H/=rk#7ٿ.bJ.fo/n?D{3[ۀh iETs4?GCcѫH?G"Dc<}Æy {=O/4wԏqvw3{{z7M -yۗ+ p6]y2gIc2O$3f R&v Ooqr ,T^U-HB6?}Ini7ͿY yԛKYI@OՊ7w~(Ũ~4s2=ځ_6=7]ۃ_Pۃ_[~Gެ_Rv7XO'9ӾĠv&u: B,<c IQ[}[Kq4.8-Qr-knn ~_>LÍ:+ffCKrNog=veƔfL 
˜*7&_)ģi>NЩF,1@όuJ9Ι"OzNy» & y*T Gl_/--92 e *q5f[Ф먁4 tDĘ6}ZOh߿F dZt? Uƀ8PRm@Pnty# hw3Թ$_x$u vrye۠RwtygԾ0 Ä"$YXw>}N>d ďhlbmf_iBQ(^^ 7oZǤ,a![yDK}b 0_zXAN LRWMb @=/Iu*UVV%yQ6&,*"䊙 ?R0t)*'VUU~ʯl$+MрY-(QF W9/͒4A0yAԉ Ѿ|`iЅ&\L]W'g#s΁Hwj̑T>xOAHN!Z "_LLOG&̫=lPIUc$(%>D3}׭M9g'(C kэoaElϦMmDa*( `! 'TzP>,82#ThdD yξSvDidusSvk6R?>l=Y}:,J9t8xbMZaǨd0|\}"e](iHF d UGEtiS(Հ)r 4F|ar6֜KK9*O}7bۺٷ}loM\v`_M˙=e8oL#P94ϻNzn|mh-SyqDZ>Gy3цarà#{&tH% uʯَYZn8[ǣS0<qx ;>Ilnϙc<)q:=BקrO[MrPGj{=D_Kb}$ 3 JuDZa|B+PDV*£}1څI޲/z[-mve6Qgy=Tekawan rpN d1p_;&kpJIBj`*)B3E$)"gGֽN s{lR*=mrg-Cu<s@4 |Ր)ugOy4t]k;_.ZS8 աkTp|~?n21lF<$5"I|#}&J y:kZ,!{$}t:!@-%#z zZg-25ȱglMu@ x)i%f'5ƐW}z`bmN8$a28.7#:3ɢE(8Ԍ!cxnjs5z84zڡ/\~|L *.QU?\1z JQ'Iq2_ߓb:*$lvNŭ5|(7NQ;zLoj*K?3EK\;_NĚ-ل&^TNssQ.xdo(8AƄP4,~pIi3*vwo<Ձ1P\۩8 ^y p@X95T_\A `ǀ~lm^,kxWMZ~z*8W$u2ZH9exHs v >/Fٿ/VA ga˳k-=A3Ufrq/!`e{="1~xzEJGʹՑa OR28`9D QxF;%HN@j@UG,]UBRݾtWJN;nyA+#l,BNR"k!Q Yo~XSAʐb !Amr*NRP wHu*KS+4MRPi8Ip=.k}%ju+4i>+C|v>$܅41NvnL͍8uFZ)z)n2ϟ n?`k 7-`U:j!aC$2@!y@S"k>IJ8boK&\0hLyՒrf$Ž{U8tfn\;\; Зg;FéяadBu 'dzɇ{4cŷ6)UR|ig= jl-c" l阧͍`M>n"ٻ=ުE1(;5OÃ`3E[k_iutp{Bz*8)#."|L-d"C$STF)Jb}FqGMf  YbpH4}(?sQ,ʬG_gpd.p-c7eX!r:E !BxrXhg% ?tGŋ$/'q1ڭ칊+7xX?XAG1Wu*;ZC@;ҚiS^ދL{>ZnUS~j㠠ö{=j@I7;.U~Ǒ h2=<߾|c/oVw* o]v$&J sِ"s&BVi8jMkʊ.O54TY*dZk@ ]j!5iL%H`,eFAE_Mgf}8qyJ@-ᆪ1,bж4+d8=[=P𑈖-$!ƅʙMFuLo췺m=T%&gͺwL q."\!o:k 6tntʌ3 o:} o:p5ԧ8xgm>ӧ2Repq:ai3@PwLN>E>m]BO!Hwm~Ծ8e*˽?Jg<{fY&9j*HUv#@9g` [!:pa EI;05 gmDģ-*YvZqB%[C2'QZp\\R^(EXrN" *rM0e$UP*N 3uɐ77عX5o?ߓYvuP5{ 5Ѣ  ^7AyK+9GZ^1gfߖ{ZOω7K;|wgʏglQHFn]:͢'+f|Qo'5O6@K`?)Zm(Jʟr2B+"JR_Xn?;"vپ*2Wy#2f_!X3Gh~bu3w3s4@ʚ.殣@z,'Qv z31x$$}}#tE9O| .5' J'Щx>t>roz]jA1r[S]硾;}~k^k;sicy2*r3L3mIfii'.dj-9\-ɶvn'ncy":c1D;PLZckфjCHQ2Eqni7gYncy":c1D2H&n mn'.{TBk{@σ$G]Ct]ߩqk "g vHт.)V 6m aceR 쿌Kg۷ MtNI[*^beo1D%س&1d (|JPA;ܬ.^zz}-Љ?pՋ/nTpǼz4 m,&?e:8=}Upf }i^.~Ar Jԭ&27Ab\t.:Xj!A5P;\󌛴=_\UWÄComّmn.āe3p1 =uyʍ]}۞۶~:}!zO杻jfnh6ݱ8Ζ{)Գ=:za6چaڰ~Em7ڡ}+W&UdC7=Em۲h| 6e0Z[ۈl8sn{(vԘ+7e-ﳳ5 z*5r=jAB/sC;t #1X6v:VٚiRJg0f#xɝ,%㾰EzAR ',leZsxA*66H<DY;ӟc_ppNݘ/Z@dgщ;И9;76Ъ!`8]A>/qxq$kJnN #"T$󛫭!T FF2 
E&z}ok|+{%?%tUpd#U+lz\wh |f5A6"|OOнOdnm۷ dON̳plyӕJp tӻuC[(j6S j,Qc 甞;ӡZ‰nfyb.g_&~6Br2v(q{߲;xZZS{fwf=#]Pq·Wo% \xgE<;s)sҎy”9Hp Y$H֤mNÉjy|AEAݠGqmt2K?Rq!Y_wl񆂠W)$%yWF:bo{zO` H#:$ l*>X8rSB~Ǵ]M7 !?nz<>5^7צퟯjS^ՎZ=-/*i.Ru{E,9]uU_ ŇZDg;GB#GmH+l{단Sʵ۳+=Y(szz8ooClޔ3oGHYtKiS2y(8rh\\)Y\4Mo;j:;^qkǜw=yАFRbt>9xW~ !CçqA]Ӗ٬S2AȄNkd[4u mbٗ N\`bկb"$>X[bAz3ֵC tgf86Tۮ '+;إE u8֞O7~3:r8ts~-}ޤ'ԇ3*w0ѺQ΄(:<ү9Binej_mŶ<7g]d7Pwֲ,6$;z3fBAJvݐv(vߓּn2ZklmmvOfwjaKi+fw(zbBP?4{C tүXF,ձqj p!t1`QbEuxmtgvфK@CG{7\_f=D.:LJH]d+˩f Ip c^~ W:ᗏkOYPk*~H[@^p ?+vO?ܬtW?'Y7xrww}Nya[.ŧ˫k$۷dv؋ߘy˫?\WܷJ9F 7}4jh4z5uJ֮V+ڱ3[V5g? X3owY+. ]Y31]Mj_9T[ Ec9kh)hsc4bn7䣱8R%LP1QM]-?Ac"J1Y12Rru>1>F-7v'"Z*d" oEY9[PGK\h|4bRQfaQ%k"GۍPxוrLH>PQtg)rNH?Q'F0E<n6 ~G٪D1rz] y#Z*Xz/K,ʥƐFC[U(QDE @Q]sr WzHE8[ u pٺv#GXGV\늕h=muc᧎鹏?kۨ "U~e{NL<)aCS͘N;2Kc*.49՚ 9Į Cz<: EOz#%N$Xcm8'DcgjnüL5`C*> )Ggc w qT"،<9Dƒ^h|Xwa:e}? gxm[A `&[[rC(vwQW8g~osWLYӂ)CLՒ<ܰkу5u)y5=L N=x0+#Ѧ ~zNܙ(GS Oy0E3mSݓ fU: 5̙nP3˙ 0@La7U%4{hv˽]-nӀmc/>zSL\z/QBG.t%fϕ.6Aְ! vg-O"<8`@{2Zg E)a ~$<B &vK0~Vn幵78O f^Z 7% \0YG.oB!\Nͤ@fX TrnbdegվV__>W;A媏Úlsck'e3 >G&(s`OSrY >vɆڹ1`jS[u 7%.HARKtQuMFM dJbRX`?{Ħc0ȠFA]vu຾w8ގLT^v/&b؛=RAyskǢ+DQ + BmaRoFDnŇTp1I#>gU?u # )%:@YDD'eKEpcB%BrE@` Bj0mc읾C)L2YM^=Œ}$XDž8pS۫7丱n_y ^m_s hxc&/9 =Dr21!S^0=aƩC:E sR~$u^Pc5248=NO]΍FHd8jhKmqɹX/0J@v>w`k}[CV,u99U}Ѓ3HC(s"xTm!Q @R]4__{euNgJ\p"X:tuPrDUR_>uŇ!p%h9SOq{uz)jP{);/|?LWD$ɝrJռȫd?돟p{\1m?φ򏋫>t(^"wХۇ._9K[n%WC6cۑG)5M^clYZGBy{]~:}ccz,vܲZsٻ֦6vW\|xo;_ʇ,lkfcrjf<㱏׻`RyZ-Fݝ!mf3nRo/3? 3~IRCN ^#sS*GZ^fx5ֳBIեFV%).13 u< Qb^Kxtߵ&fףNh% ƧS=.0:h͏zk)xUNM̛|fht36 -X4|UafS//RMf͝"dO~tv'(ͦ}j+8h޵oқx|zV] _T&ߏ;#و.*ETڿz.\vj;/@ ېϦ }ω؋9 gDl)$G\\!(78MJb·sŀJ/q\1 ٌ&.}*iK ^AՊwބ)<; %?{lCya}>H]zZDpJi*rtܹ?3-lJGEY6+zi[RdB<"UXsM*dZH*,;,#;*](s5CR% w6B PXQk_tڵly&)j !PEE-c]0%XY+(J "I0WB;KJV,a(Eˈ.[l-.v}Ş]jh2\&s`Zwع'`V3J{Yjꥦs:mq?Oz{\hA;۹f7=~7 G?܄ >|27hLHX/E3lpԀ(׫h5aqݻ$? 
<|Ɇ.gDGR>O~>[)VtHi\6!dQ-<>Zumz%bZT z$rUGW?j5ǔm?eɱAnN4rj@Gv{#D ~ WbpeXeV+xaVV+TșbRiz"L) OkK5}&̼yEmv_^ElS6++{j__^rW/HsJL!gKeMx'屛nj*'%8XA4.שӳ5%|Jk4knrsY; V&kTr[М]ő?+ko.|i8vR$na?fD"~vO:s݃ܜY呧J!o/T\zњ8"%Lo2׻($>#uV\:rhn%QjJEBڎ, 1Xv Hqj" [upa.H bQOH('@A#U!<`:(aHC%2H:hE`mr bOobH (*`P\Ndq:R$Hk%i)s<Ÿ_GL'KPh"@!PA'b:aqLH) E"BHQd], Ы . L@!2, ;0 F,$80p2QZ'UG%WRQ8*WwߙX,sR5%9JIΩL8Ma=QO"UJ]Mbb~n5Biw;* z_x}kޱIO e-*[Q T= %df[d7jY7-GވLRw0cވ`z~U"8b`AAn:WuӁ@77D7a7btG?Œwg?`="vU0Ua? fTbK2/Mi/K? _ C=%J;)@V?^2̄$3ڿb!K{_˲K= _ 0z*%'^Wj>Y {v{NWz_j́{JgWN!ĬNz^jb9*d\rKVRKtRsZoiYt]I.&*탼9ĕRFS_. 眄9G3I44TVF19CF@8iȿ&UH]6r* M-]}_$i' %_\ƺ۟ZHE>ł=EGv)RH!^kB^"D-zEZ1TR*$”2ZJlڔ?OG0RjH",傏 lT\d4Q c<y[_~Ubާ Ï #vo= Oᅧ"Iۭ@ˈQg%HFDWєG&QH.&fep2[nY%oG_KWa*߿ȁ0 qe4>XܝϬLK2 R4^C߃h/ؖ4RyuJtH–}aE)%Uj(B%xipJ5kz I)5IIBC`}HEG0a2^p ڬcJfߍSk!k,0 LsAaQJOc깂߇0P]AC:j3˅BeZ4R(2#XX,'c0E,!cNa`A.Ĵbճ*h,֚rX_Q Ybj5-gV=sf2pi* &`2BM2>+"@3Ԇw&,€z ov"Q~w9K6r6G8 cI#B"h`4ed^x\&'W  x\XVx eȮ~{g=؃Fxa9`0Uk&L".'À3 3+LǛc5 KY zc;M$,|cf-uNF#2JDt|VpfLp&7Ip2i{`ۻwvݯ[ RrFZF; rI2lL\abyй ~Y$Bb(tmOY*&aT#ةTpQUwYH[sݿh\ޞ}#7hӴ"c` ]zf)/zf=jPB{X}GYU#\اj. FjO*daL#˄z$R$jH"ۮKii̋$! /+`Ȍ[Ptsbc}6g_efCD^bbG,dg!־J$}ʂYYYlp]{$rQxמR: FXP:ܵמ_ QwtY.exZUMڛn]L2yT-B,cYgI"Z܋AކI,LmTGㄭNކx"MZ|pI%8^QDjeS%AUqܝEK^/_xTd"tFH:h$1Qdd0DG:ʡT P<2F}(WLň-,p.V۱0/U\NDeA PCg,X,l )U47x<26"E$4R R]da5 UJ lAoqJ ⨖#JA0$q̃Iu5 FN؍Gna}߀>+~5&0xSI^^%جiTS}"V`|MσVQ5h79@$Q^9ޠ^_l0T!{DIctDWAyc %flLly(aػig2B)㟫)0FGl֕&Szϡx'.؁#esaMe˴l/,l۝;x lAo61w߭wa \M$eN~noo֎8jٹ Nis'܇qc;ywytܸ5Szx򋝝zvg_^|8DÞtc6yý?y8o×Kϯv?e;tσcK~ڽjm{KO[;>۷HkStuiӞ]sӽ]nooaIz_7ٮݫm}su~oy9s3p>Ϻ%wٸ:?۹uZzmWGa===W_>xg_~l!O }¿]]ߝ?'98BgHL# jm] K. 
0 uon «ٻҞ6WP>ܫs{_{D/ [›+goؘc$${w=USKWuӶRG]4?_mt/EyXY<}zV'BLH}B[+gN}'^m/k Ή<<+_lniydDgە~C\VVzun^6xq k-7VjjriŽV՚n/7Vaҍ<1wQۇGmcsmeס㥃m+hm_r}rXuU-Yjrwx9mjՅUq|tan;w%>Z;ZWRKIeu?U;QۓKwrCƥN}iuu)D|cl1>>buAcvux|;dr|kYҷ_ bUV1NuVN݃ V6^:f5VZ:WN/@]w/֯u.Y}\nj ]?j8={~mH~t`]]1HbuWecn#\*{}__Z?].ǟtҷϫ{?J{u3=~O t;딛-ڭj}F MF-We^~I3$a@딺09l*Wtm|*v\BW-oڮkw;Q:a6 {.t́o7;N;ӿ[Lz3L7#8&Y#9|}y>^ &)R2`冈vss,O?0U>F Vz&#4K!ۗ6Wqn5Jiܛ|΋RJ}+ӪߪT6f½ 5>k۰Ae+|OyZvsy]C`i\\n7lZ=:>Oq8zWozӟ)Oq =[W|RR+D2ri8eP\K윀^m8_#!Nsv릸/FPt=װjFp.HwׇTb̬fVʳV^1- {6%63P-n=fkهQ[Նq4 s;qv3f*>\Na9JBcMH`_=홼Ô$>U]yU6ЅNStGQ{VM1n8GDi1\#uZN_C3NJk5cBI `8 ]kFm0}v Pn@Z{LJ)82_Qcu)` j^ W Dd8:-7jRy81Οࣴs PdO)NmȕQdՉF&J69OV)x J#(&wH~ULϮہ~AtKT~+g)9i&1ZdΒF 6oWD{Ig蒄WXKEBHh$MH8!Q9Jd 9 G3pGGW 2c ' *˜4DIC28o 3p;0(x| [,XɵQOP Q3V`__ib} 24c͗Oj&n88Uk<~J#ŻG*tNjuFzjvqRzٓ fw+үGRk.5Y]"tdF!X8lPaRGjRҕ8cgBz\wJ'1*(cv )cd.4:k]TJJZJK +b0;"|BQ"EI)`'o1ܰF??T9_Pʣ$ivD8H:WFVYZ ’)_%~OZs\m=J#Q܃Iĕ,HDx" &j[A?|-x 07o??Bá3LҨ/j4Nqذ MLLPNJ$A@EIKg˂_蹏@nJ)-PM# @|95Xg??q. ? hjfR*5;mPE&EtRD@%-IXęàbD`΄rC4Uf=.yB u"cvQ(.mäA-L ]oqf|\Jk}⌓HS <`"cuÀe[Mh8pI6Ƣ^[mJf9bb g cB"9\1ݑ4^) gܨ'!"mGfͷJ:rk K⠎)<:$ܙ)1՘+2#zϒ&J[(g>RH$bh:FG#r.q%1 IIvx~qqyޙVʬx +jR^9v1Ouix*[7:>6<ԡA4 WnmFX'4p4Y Zu9jk0bc2;^c뫓&xeX:F0ǐi2g#p{D<.3x&4f̼qauP#?I VB`F N"^"{x`Hm#7`f٧':5vla][; Pr`vM4<փ& D {~z2?pLzlK4zcrF %f>7 &Vz L-[0F'Q9),6;1GC1uBlOF{ivz 6ұ?FRw+M>۟fʥY:د{ʇBțpgq~o陃B\c:hs^ V!GģOȻ F6Hv)i1zC;2B)]3=\fo9,RpފmV K>N9f;q9;"TD1Vh&:RTr07BhGQib.F9da\AC,O?DXEHb9]ЏRr"K0c('(( J( TYbr,%Пyv> ٧Xd{YN^7eQ'HetSI^A+ן\&tE2 _VQ‘ă IwsC62"5ht$xDx,hwT0LY4<U= K (A xPD%8/)VF"#˫)W`2ț|ԭFOU> s5:5i}IS8:IIayJ}f#9X%P[$"Kfѡ)a@$fHa4 4u4K3GxUR$8tZ'DDĭюD')M#RVVFw)Gj>r P24' be$*DFwJE}iJ;BK0h;~3MMD??pf8X8 S0d,J2.ɶ.R"IT4PЊ@PA9D N#DL9!F ZO@ RQ %AD9]LA6B8G}U%q?( 5G@v63Qt'M^T Y~xYKnp-j5E/G5hู Id7!{6*!M!S_PϜq2r;DH=$r[EGN sI(·ƢףVRešq𥹥 D %_ 2$EpԶr%ōpWnzB%ϥ@znD<}`&@ERD0ȤDJ'>&{Rc]#)Gß龢 #jZq1CD/ؚ-Uh=Hb[YT/h!7A[m8j\|v=eKZyߞX,h: nJXr뢷z:LxE:xL3&v¬nt}&ʤ v=6Ӕn<͸mOW7F;t ԏ54GvW]mWqhu g1kP9W<-pd*E욣Fvw^×UFRB-LpgA&>,>7^ ï~o,Յ'y;J%:* OvYPq 
Feb 23 00:07:20 crc systemd[1]: Starting Kubernetes Kubelet...
Feb 23 00:07:20 crc restorecon[4706]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286
not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as
customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c440,c975 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 23 00:07:20 crc 
restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 00:07:20 crc restorecon[4706]: 
/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 23 00:07:20 crc restorecon[4706]: 
/var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c22 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 23 00:07:20 crc restorecon[4706]: 
/var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 00:07:20 crc restorecon[4706]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 23 00:07:20 crc restorecon[4706]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 23 00:07:20 crc restorecon[4706]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 23 00:07:20 crc restorecon[4706]: 
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c968,c969 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 23 00:07:20 crc restorecon[4706]:
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c1016 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 23 00:07:20 crc 
restorecon[4706]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 23 00:07:20 crc restorecon[4706]: 
/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 23 00:07:20 crc restorecon[4706]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 
00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 23 00:07:20 crc restorecon[4706]: 
/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 23 00:07:20 crc restorecon[4706]: 
/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 00:07:20 crc restorecon[4706]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 00:07:20 crc 
restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 00:07:20 crc 
restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 00:07:20 crc restorecon[4706]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 00:07:20 crc restorecon[4706]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Feb 23 00:07:20 crc 
restorecon[4706]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:20 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:20 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc 
restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 
23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 
crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc 
restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 23 00:07:21 crc 
restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 23 00:07:21 crc 
restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 23 00:07:21 crc restorecon[4706]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 23 00:07:21 crc restorecon[4706]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 23 00:07:21 crc restorecon[4706]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Feb 23 00:07:21 crc kubenswrapper[4735]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 23 00:07:21 crc kubenswrapper[4735]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Feb 23 00:07:21 crc kubenswrapper[4735]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 23 00:07:21 crc kubenswrapper[4735]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Feb 23 00:07:21 crc kubenswrapper[4735]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Feb 23 00:07:21 crc kubenswrapper[4735]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.984588    4735 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.994477    4735 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.994510    4735 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.994522    4735 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.994533    4735 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.994543    4735 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.994552    4735 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.994560    4735 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.994570    4735 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.994579    4735 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.994588    4735 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.994597    4735 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.994607    4735 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.994616    4735 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.994626    4735 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.994636    4735 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.994645    4735 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.994655    4735 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.994664    4735 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.994674    4735 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.994683    4735 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.994692    4735 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.994702    4735 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.994711    4735 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.994720    4735 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.994729    4735 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.994738    4735 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.994748    4735 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.994756    4735 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.994825    4735 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.994837    4735 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.994881    4735 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.994890    4735 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.994898    4735 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.994906    4735 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.994917    4735 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.994927    4735 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.994935    4735 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.994944    4735 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.994952    4735 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.994960    4735 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.994970    4735 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.994978    4735 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.994985    4735 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.994993    4735 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.995001    4735 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.995009    4735 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.995016    4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.995024    4735 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.995031    4735 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.995039    4735 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.995047    4735 feature_gate.go:330] unrecognized feature gate: Example
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.995057    4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.995070    4735 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.995082    4735 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.995098    4735 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.995109    4735 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.995120    4735 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.995130    4735 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.995142    4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.995152    4735 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.995163    4735 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.995171    4735 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.995178    4735 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.995186    4735 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.995193    4735 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.995201    4735 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.995228    4735 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.995236    4735 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.995244    4735 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.995251    4735 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 23 00:07:21 crc kubenswrapper[4735]: W0223 00:07:21.995259    4735 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995413    4735 flags.go:64] FLAG: --address="0.0.0.0"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995430    4735 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995446    4735 flags.go:64] FLAG: --anonymous-auth="true"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995457    4735 flags.go:64] FLAG: --application-metrics-count-limit="100"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995469    4735 flags.go:64] FLAG: --authentication-token-webhook="false"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995478    4735 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995490    4735 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995502    4735 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995511    4735 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995520    4735 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995530    4735 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995539    4735 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995549    4735 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995558    4735 flags.go:64] FLAG: --cgroup-root=""
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995566    4735 flags.go:64] FLAG: --cgroups-per-qos="true"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995588    4735 flags.go:64] FLAG: --client-ca-file=""
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995597    4735 flags.go:64] FLAG: --cloud-config=""
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995606    4735 flags.go:64] FLAG: --cloud-provider=""
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995615    4735 flags.go:64] FLAG: --cluster-dns="[]"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995625    4735 flags.go:64] FLAG: --cluster-domain=""
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995634    4735 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995643    4735 flags.go:64] FLAG: --config-dir=""
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995652    4735 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995661    4735 flags.go:64] FLAG: --container-log-max-files="5"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995672    4735 flags.go:64] FLAG: --container-log-max-size="10Mi"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995681    4735 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995690    4735 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995699    4735 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995709    4735 flags.go:64] FLAG: --contention-profiling="false"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995717    4735 flags.go:64] FLAG: --cpu-cfs-quota="true"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995728    4735 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995738    4735 flags.go:64] FLAG: --cpu-manager-policy="none"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995747    4735 flags.go:64] FLAG: --cpu-manager-policy-options=""
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995774    4735 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995783    4735 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995792    4735 flags.go:64] FLAG: --enable-debugging-handlers="true"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995801    4735 flags.go:64] FLAG: --enable-load-reader="false"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995810    4735 flags.go:64] FLAG: --enable-server="true"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995819    4735 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995831    4735 flags.go:64] FLAG: --event-burst="100"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995840    4735 flags.go:64] FLAG: --event-qps="50"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995880    4735 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995889    4735 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995899    4735 flags.go:64] FLAG: --eviction-hard=""
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995911    4735 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995920    4735 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995929    4735 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995941    4735 flags.go:64] FLAG: --eviction-soft=""
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995950    4735 flags.go:64] FLAG: --eviction-soft-grace-period=""
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995959    4735 flags.go:64] FLAG: --exit-on-lock-contention="false"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995968    4735 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995978    4735 flags.go:64] FLAG: --experimental-mounter-path=""
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995987    4735 flags.go:64] FLAG: --fail-cgroupv1="false"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.995996    4735 flags.go:64] FLAG: --fail-swap-on="true"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.996005    4735 flags.go:64] FLAG: --feature-gates=""
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.996015    4735 flags.go:64] FLAG: --file-check-frequency="20s"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.996024    4735 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.996034    4735 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.996043    4735 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.996052    4735 flags.go:64] FLAG: --healthz-port="10248"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.996062    4735 flags.go:64] FLAG: --help="false"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.996071    4735 flags.go:64] FLAG: --hostname-override=""
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.996080    4735 flags.go:64] FLAG: --housekeeping-interval="10s"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.996089    4735 flags.go:64] FLAG: --http-check-frequency="20s"
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.996098    4735 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Feb 23 00:07:21 crc kubenswrapper[4735]: I0223 00:07:21.996107    4735 flags.go:64] FLAG: --image-credential-provider-config=""
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996116    4735 flags.go:64] FLAG: --image-gc-high-threshold="85"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996125    4735 flags.go:64] FLAG: --image-gc-low-threshold="80"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996134    4735 flags.go:64] FLAG: --image-service-endpoint=""
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996143    4735 flags.go:64] FLAG: --kernel-memcg-notification="false"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996152    4735 flags.go:64] FLAG: --kube-api-burst="100"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996161    4735 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996171    4735 flags.go:64] FLAG: --kube-api-qps="50"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996182    4735 flags.go:64] FLAG: --kube-reserved=""
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996193    4735 flags.go:64] FLAG: --kube-reserved-cgroup=""
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996203    4735 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996215    4735 flags.go:64] FLAG: --kubelet-cgroups=""
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996226    4735 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996238    4735 flags.go:64] FLAG: --lock-file=""
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996251    4735 flags.go:64] FLAG: --log-cadvisor-usage="false"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996262    4735 flags.go:64] FLAG: --log-flush-frequency="5s"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996274    4735 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996291    4735 flags.go:64] FLAG: --log-json-split-stream="false"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996302    4735 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996314    4735 flags.go:64] FLAG: --log-text-split-stream="false"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996325    4735 flags.go:64] FLAG: --logging-format="text"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996335    4735 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996347    4735 flags.go:64] FLAG: --make-iptables-util-chains="true"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996359    4735 flags.go:64] FLAG: --manifest-url=""
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996370    4735 flags.go:64] FLAG: --manifest-url-header=""
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996383    4735 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996393    4735 flags.go:64] FLAG: --max-open-files="1000000"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996404    4735 flags.go:64] FLAG: --max-pods="110"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996414    4735 flags.go:64] FLAG: --maximum-dead-containers="-1"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996423    4735 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996432    4735 flags.go:64] FLAG: --memory-manager-policy="None"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223
00:07:21.996441 4735 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996450 4735 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996460 4735 flags.go:64] FLAG: --node-ip="192.168.126.11" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996469 4735 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996489 4735 flags.go:64] FLAG: --node-status-max-images="50" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996498 4735 flags.go:64] FLAG: --node-status-update-frequency="10s" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996507 4735 flags.go:64] FLAG: --oom-score-adj="-999" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996516 4735 flags.go:64] FLAG: --pod-cidr="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996525 4735 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996539 4735 flags.go:64] FLAG: --pod-manifest-path="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996547 4735 flags.go:64] FLAG: --pod-max-pids="-1" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996556 4735 flags.go:64] FLAG: --pods-per-core="0" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996565 4735 flags.go:64] FLAG: --port="10250" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996574 4735 flags.go:64] FLAG: --protect-kernel-defaults="false" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996583 4735 flags.go:64] FLAG: --provider-id="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996592 4735 flags.go:64] FLAG: --qos-reserved="" Feb 23 00:07:22 crc 
kubenswrapper[4735]: I0223 00:07:21.996607 4735 flags.go:64] FLAG: --read-only-port="10255" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996617 4735 flags.go:64] FLAG: --register-node="true" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996626 4735 flags.go:64] FLAG: --register-schedulable="true" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996636 4735 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996651 4735 flags.go:64] FLAG: --registry-burst="10" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996660 4735 flags.go:64] FLAG: --registry-qps="5" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996669 4735 flags.go:64] FLAG: --reserved-cpus="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996678 4735 flags.go:64] FLAG: --reserved-memory="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996689 4735 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996698 4735 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996707 4735 flags.go:64] FLAG: --rotate-certificates="false" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996716 4735 flags.go:64] FLAG: --rotate-server-certificates="false" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996725 4735 flags.go:64] FLAG: --runonce="false" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996734 4735 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996743 4735 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996753 4735 flags.go:64] FLAG: --seccomp-default="false" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996761 4735 flags.go:64] FLAG: --serialize-image-pulls="true" Feb 23 00:07:22 crc 
kubenswrapper[4735]: I0223 00:07:21.996771 4735 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996781 4735 flags.go:64] FLAG: --storage-driver-db="cadvisor" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996792 4735 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996804 4735 flags.go:64] FLAG: --storage-driver-password="root" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996815 4735 flags.go:64] FLAG: --storage-driver-secure="false" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996827 4735 flags.go:64] FLAG: --storage-driver-table="stats" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996838 4735 flags.go:64] FLAG: --storage-driver-user="root" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996882 4735 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996892 4735 flags.go:64] FLAG: --sync-frequency="1m0s" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996902 4735 flags.go:64] FLAG: --system-cgroups="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996911 4735 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996929 4735 flags.go:64] FLAG: --system-reserved-cgroup="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996940 4735 flags.go:64] FLAG: --tls-cert-file="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996951 4735 flags.go:64] FLAG: --tls-cipher-suites="[]" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996965 4735 flags.go:64] FLAG: --tls-min-version="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996985 4735 flags.go:64] FLAG: --tls-private-key-file="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.996999 4735 flags.go:64] FLAG: 
--topology-manager-policy="none" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.997009 4735 flags.go:64] FLAG: --topology-manager-policy-options="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.997018 4735 flags.go:64] FLAG: --topology-manager-scope="container" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.997028 4735 flags.go:64] FLAG: --v="2" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.997050 4735 flags.go:64] FLAG: --version="false" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.997064 4735 flags.go:64] FLAG: --vmodule="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.997080 4735 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.997092 4735 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997369 4735 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997386 4735 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997398 4735 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997409 4735 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997419 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997428 4735 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997438 4735 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997449 4735 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 
00:07:21.997459 4735 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997468 4735 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997478 4735 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997488 4735 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997498 4735 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997509 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997520 4735 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997529 4735 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997540 4735 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997549 4735 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997557 4735 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997565 4735 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997572 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997583 4735 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997591 4735 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997604 4735 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997612 4735 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997619 4735 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997627 4735 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997638 4735 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997648 4735 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997657 4735 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997665 4735 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997673 4735 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997682 4735 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997691 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997702 4735 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997710 4735 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997718 4735 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997728 4735 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997738 4735 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997747 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997756 4735 feature_gate.go:330] unrecognized feature gate: Example
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997764 4735 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997773 4735 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997781 4735 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997789 4735 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997796 4735 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997805 4735 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997813 4735 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997822 4735 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997832 4735 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997840 4735 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997880 4735 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997888 4735 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997897 4735 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997905 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997914 4735 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997924 4735 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997934 4735 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997944 4735 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997954 4735 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997964 4735 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997974 4735 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997983 4735 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997991 4735 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.997999 4735 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.998007 4735 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.998014 4735 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.998022 4735 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.998029 4735 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.998037 4735 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:21.998048 4735 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:21.998072 4735 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.010312 4735 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.010357 4735 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.010507 4735 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.010527 4735 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.010541 4735 feature_gate.go:330] unrecognized feature gate: Example
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.010552 4735 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.010562 4735 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.010572 4735 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.010582 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.010591 4735 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.010601 4735 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.010612 4735 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.010621 4735 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.010632 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.010642 4735 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.010656 4735 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.010671 4735 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.010683 4735 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.010694 4735 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.010704 4735 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.010715 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.010727 4735 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.010738 4735 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.010752 4735 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.010765 4735 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.010777 4735 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.010789 4735 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.010803 4735 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.010814 4735 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.010824 4735 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.010834 4735 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.010844 4735 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.010886 4735 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.010897 4735 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.010906 4735 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.010916 4735 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.010926 4735 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.010936 4735 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.010947 4735 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.010961 4735 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.010974 4735 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.010986 4735 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.010998 4735 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011009 4735 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011020 4735 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011029 4735 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011040 4735 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011051 4735 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011061 4735 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011071 4735 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011081 4735 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011090 4735 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011101 4735 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011110 4735 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011148 4735 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011159 4735 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011169 4735 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011180 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011191 4735 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011201 4735 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011213 4735 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011225 4735 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011237 4735 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011250 4735 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011260 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011270 4735 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011279 4735 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011290 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011300 4735 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011310 4735 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011320 4735 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011330 4735 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011340 4735 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.011356 4735 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011643 4735 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011660 4735 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011672 4735 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011683 4735 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011693 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011703 4735 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011713 4735 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011723 4735 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011733 4735 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011744 4735 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011754 4735 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011764 4735 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011774 4735 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011784 4735 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011795 4735 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011807 4735 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011818 4735 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011828 4735 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011837 4735 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011881 4735 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011892 4735 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011902 4735 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011912 4735 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011924 4735 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011936 4735 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011951 4735 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011963 4735 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011974 4735 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011985 4735 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.011996 4735 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.012007 4735 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.012019 4735 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.012029 4735 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.012039 4735 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.012091 4735 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.012108 4735 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.012118 4735 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.012130 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.012140 4735 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.012150 4735 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.012160 4735 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.012170 4735 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.012180 4735 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.012190 4735 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.012200 4735 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.012210 4735 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.012220 4735 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.012230 4735 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.012241 4735 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.012251 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.012261 4735 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.012271 4735 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.012282 4735 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.012298 4735 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.012311 4735 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.012323 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.012334 4735 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.012346 4735 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.012356 4735 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.012366 4735 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.012377 4735 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.012390 4735 feature_gate.go:330] unrecognized feature gate: Example
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.012400 4735 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.012410 4735 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.012420 4735 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.012430 4735 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.012443 4735 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.012456 4735 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.012468 4735 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.012480 4735 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.012490 4735 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.012506 4735 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.014041 4735 server.go:940] "Client rotation is on, will bootstrap in background"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.020766 4735 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.020986 4735 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.023126 4735 server.go:997] "Starting client certificate rotation"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.023177 4735 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.024408 4735 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-11 17:48:05.097759719 +0000 UTC
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.024515 4735 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.052385 4735 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 23 00:07:22 crc kubenswrapper[4735]: E0223 00:07:22.054240 4735 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.213:6443: connect: connection refused" logger="UnhandledError"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.059018 4735 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.085394 4735 log.go:25] "Validated CRI v1 runtime API"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.126201 4735 log.go:25] "Validated CRI v1 image API"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.129342 4735 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.135689 4735 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-23-00-02-39-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.135738 4735 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:45 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:42 fsType:tmpfs blockSize:0}]
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.168026 4735 manager.go:217] Machine: {Timestamp:2026-02-23 00:07:22.163958087 +0000 UTC m=+0.627504148 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:34aea2d1-4777-4ec2-a0dd-7ee942962cf5 BootID:fc670e79-a4c0-4f94-a41e-9a217a93a98f Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:45 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:42 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:27:d4:67 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:27:d4:67 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:3c:6e:24 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:cf:42:9d Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:81:c1:09 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:6c:02:11 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:6a:e0:f2:c5:80:b9 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:ae:63:75:b0:50:41 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.168483 4735 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.168725 4735 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.170287 4735 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.170611 4735 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.170676 4735 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.171039 4735 topology_manager.go:138] "Creating topology manager with none policy"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.171060 4735 container_manager_linux.go:303] "Creating device plugin manager"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.171540 4735 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.171590 4735 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.171881 4735 state_mem.go:36] "Initialized new in-memory state store"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.172022 4735 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.176272 4735 kubelet.go:418] "Attempting to sync node with API server"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.176308 4735 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.176348 4735 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.176370 4735 kubelet.go:324] "Adding apiserver pod source"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.176389 4735 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.181214 4735 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.182010 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.182058 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused
Feb 23 00:07:22 crc kubenswrapper[4735]: E0223 00:07:22.182107 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.213:6443: connect: connection refused" logger="UnhandledError"
Feb 23 00:07:22 crc kubenswrapper[4735]: E0223 00:07:22.182207 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.213:6443: connect: connection refused" logger="UnhandledError"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.182777 4735 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.184508 4735 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.186488 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.186532 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.186548 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.186563 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.186585 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.186599 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.186612 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.186633 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.186649 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.186662 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.186680 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.186694 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.188373 4735 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.189050 4735 server.go:1280] "Started kubelet"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.194085 4735 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.194078 4735 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.195047 4735 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.195045 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused
Feb 23 00:07:22 crc systemd[1]: Started Kubernetes Kubelet.
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.196792 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.196890 4735 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.196932 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 08:36:14.9875208 +0000 UTC
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.197124 4735 volume_manager.go:287] "The desired_state_of_world populator starts"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.197158 4735 volume_manager.go:289] "Starting Kubelet Volume Manager"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.197271 4735 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Feb 23 00:07:22 crc kubenswrapper[4735]: E0223 00:07:22.197615 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.200121 4735 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.200164 4735 factory.go:55] Registering systemd factory
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.200182 4735 factory.go:221] Registration of the systemd container factory successfully
Feb 23 00:07:22 crc kubenswrapper[4735]: E0223 00:07:22.205953 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" interval="200ms"
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.205999 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused
Feb 23 00:07:22 crc kubenswrapper[4735]: E0223 00:07:22.206126 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.213:6443: connect: connection refused" logger="UnhandledError"
Feb 23 00:07:22 crc kubenswrapper[4735]: E0223 00:07:22.205888 4735 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.213:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1896b77af7d9c33a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-23 00:07:22.189005626 +0000 UTC m=+0.652551637,LastTimestamp:2026-02-23 00:07:22.189005626 +0000 UTC m=+0.652551637,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.208204 4735 server.go:460] "Adding debug handlers to kubelet server"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.208502 4735 factory.go:153] Registering CRI-O factory
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.208538 4735 factory.go:221] Registration of the crio container factory successfully
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.208576 4735 factory.go:103] Registering Raw factory
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.208594 4735 manager.go:1196] Started watching for new ooms in manager
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.209229 4735 manager.go:319] Starting recovery of all containers
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.219661 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.219735 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.219757 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.219776 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.219793 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.219811 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.219829 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.219846 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.219893 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.219913 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.219930 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.219949 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.219969 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.219989 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.220006 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.220027 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.220045 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.220062 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.220084 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.220101 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.220217 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f"
volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.220237 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.220259 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.220281 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.220301 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.220318 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.220343 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.220392 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.220412 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.220429 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.220448 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.220464 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.220482 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.220501 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.220518 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.220534 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.220553 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.220571 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.220588 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" 
volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.220605 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.220622 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.220640 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.220657 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.220676 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.220699 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" 
volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.220716 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.220733 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.220750 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.220770 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.221557 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.221585 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" 
seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.221629 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.221667 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.221690 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.225328 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.225388 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.225422 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.226338 
4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.227333 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.227407 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.227434 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.227467 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.228045 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.228073 4735 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.228106 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.230322 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.230522 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.230728 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.230938 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.231113 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.231346 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.231607 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.231785 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.231983 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.232162 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.232364 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.232533 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.232699 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.234128 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.234218 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.234240 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.234262 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" 
volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.234282 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.234303 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.234324 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.234344 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.234364 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.234383 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.234449 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.234479 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.234506 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.234530 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.234550 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.234569 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" 
seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.234626 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.234646 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.234666 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.234686 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.234705 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.234724 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.234743 4735 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.234761 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.234780 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.234798 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.234838 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.234889 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.234920 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.234942 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.234965 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.234986 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.235017 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.235041 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.237367 4735 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" 
volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.237412 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.237435 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.237456 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.237477 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.237500 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.237529 4735 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.237550 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.237576 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.237595 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.237645 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.237663 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.237682 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.237701 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.237720 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.237740 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.237759 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.237779 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.237797 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.237815 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.237836 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.237881 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.237902 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.237931 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.237976 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" 
seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.238003 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.238028 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.238071 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.238091 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.238110 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.238130 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Feb 23 00:07:22 crc 
kubenswrapper[4735]: I0223 00:07:22.238160 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.238195 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.238224 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.238243 4735 manager.go:324] Recovery completed Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.238252 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.238486 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.238542 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Feb 23 
00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.238573 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.238606 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.238648 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.238681 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.238710 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.238749 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.238777 4735 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.238805 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.238833 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.238895 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.238937 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.238967 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.239004 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.239042 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.239071 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.239103 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.239130 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.239158 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.239185 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.239212 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.239242 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.239268 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.239295 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.239324 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.239352 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.239380 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.239408 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.239436 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.239465 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.239495 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.239523 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" 
seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.239558 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.239584 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.239612 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.239638 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.239665 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.239692 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 
00:07:22.239719 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.239744 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.239771 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.239794 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.239818 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.239843 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.239915 4735 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.239942 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.239969 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.239996 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.240025 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.240053 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.240080 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.240107 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.240134 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.240160 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.240185 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.240238 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.240263 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.240288 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.240316 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.240341 4735 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.240365 4735 reconstruct.go:97] "Volume reconstruction finished"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.240383 4735 reconciler.go:26] "Reconciler: start to sync state"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.256332 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.258636 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.258690 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.258703 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.260188 4735 cpu_manager.go:225] "Starting CPU manager" policy="none"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.260208 4735 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.260230 4735 state_mem.go:36] "Initialized new in-memory state store"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.268286 4735 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.270757 4735 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.270825 4735 status_manager.go:217] "Starting to sync pod status with apiserver"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.270908 4735 kubelet.go:2335] "Starting kubelet main sync loop"
Feb 23 00:07:22 crc kubenswrapper[4735]: E0223 00:07:22.271143 4735 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.272178 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused
Feb 23 00:07:22 crc kubenswrapper[4735]: E0223 00:07:22.272245 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.213:6443: connect: connection refused" logger="UnhandledError"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.279627 4735 policy_none.go:49] "None policy: Start"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.281477 4735 memory_manager.go:170] "Starting memorymanager" policy="None"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.281514 4735 state_mem.go:35] "Initializing new in-memory state store"
Feb 23 00:07:22 crc kubenswrapper[4735]: E0223 00:07:22.297831 4735 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.353344 4735 manager.go:334] "Starting Device Plugin manager"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.353737 4735 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.353759 4735 server.go:79] "Starting device plugin registration server"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.354344 4735 eviction_manager.go:189] "Eviction manager: starting control loop"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.354366 4735 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.354996 4735 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.355095 4735 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.355105 4735 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Feb 23 00:07:22 crc kubenswrapper[4735]: E0223 00:07:22.364355 4735 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.371746 4735 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"]
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.371862 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.372956 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.373000 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.373012 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.373171 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.373374 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.373450 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.374085 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.374108 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.374117 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.374227 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.374551 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.374638 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.375146 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.375177 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.375186 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.375346 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.375431 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.375483 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.375795 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.375974 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.376040 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.376126 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.376171 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.376192 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.377488 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.377544 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.377560 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.377575 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.377619 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.377642 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.377820 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.378203 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.378404 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.379524 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.379583 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.379617 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.380222 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.380284 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.380299 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.380408 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.380382 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.382026 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.382067 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.382084 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 00:07:22 crc kubenswrapper[4735]: E0223 00:07:22.407386 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" interval="400ms"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.443181 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.443230 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.443254 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.443276 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.443296 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.443344 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.443390 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.443475 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.443549 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.443573 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.443592 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.443624 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.443713 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.443789 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.443825 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.454840 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.457517 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.457589 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.457609 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.457649 4735 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: E0223 00:07:22.458238 4735 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.213:6443: connect: connection refused" node="crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.545595 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.545659 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.545691 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.545722 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.545755 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.545786 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.545816 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.545890 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.545938 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.545973 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.546002 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.546032 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.546063 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.546129 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.546173 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.546287 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.546365 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.546304 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.546461 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.546467 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.546509 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.546544 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.546564 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.546618 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.546619 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.546646 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.546698 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.546718 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.546765 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.546768 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.658408 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.659818 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.659880 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.659890 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.659921 4735 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: E0223 00:07:22.660438 4735 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.213:6443: connect: connection refused" node="crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.705763 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.724162 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.748549 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-c4f1ca094309814d9fcae918a2e7790903c174e5485bd1505ed95d8f8f7fb462 WatchSource:0}: Error finding container c4f1ca094309814d9fcae918a2e7790903c174e5485bd1505ed95d8f8f7fb462: Status 404 returned error can't find the container with id c4f1ca094309814d9fcae918a2e7790903c174e5485bd1505ed95d8f8f7fb462
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.752389 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.761208 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: I0223 00:07:22.767321 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.787775 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-9b1ef4800e028e201f2d1c94957cb0cd389e0b9f952396dcbff97049464c0029 WatchSource:0}: Error finding container 9b1ef4800e028e201f2d1c94957cb0cd389e0b9f952396dcbff97049464c0029: Status 404 returned error can't find the container with id 9b1ef4800e028e201f2d1c94957cb0cd389e0b9f952396dcbff97049464c0029
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.790441 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-004ecc0a405445089ec2aa124099154f449a30bc8dd12a665e4e098d93471a94 WatchSource:0}: Error finding container 004ecc0a405445089ec2aa124099154f449a30bc8dd12a665e4e098d93471a94: Status 404 returned error can't find the container with id 004ecc0a405445089ec2aa124099154f449a30bc8dd12a665e4e098d93471a94
Feb 23 00:07:22 crc kubenswrapper[4735]: W0223 00:07:22.799176 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-b375367694310b66dc459d0f84310a24350eec5a8a6f71979735ada80c843593 WatchSource:0}: Error finding container b375367694310b66dc459d0f84310a24350eec5a8a6f71979735ada80c843593: Status 404 returned error can't find the container with id b375367694310b66dc459d0f84310a24350eec5a8a6f71979735ada80c843593
Feb 23 00:07:22 crc kubenswrapper[4735]: E0223 00:07:22.811722 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" interval="800ms"
Feb 23 00:07:23 crc kubenswrapper[4735]: I0223 00:07:23.061277 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 23 00:07:23 crc kubenswrapper[4735]: I0223 00:07:23.063659 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 00:07:23 crc kubenswrapper[4735]: I0223 00:07:23.063732 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 00:07:23 crc kubenswrapper[4735]: I0223 00:07:23.063754 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 00:07:23 crc kubenswrapper[4735]: I0223 00:07:23.063802 4735 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 23 00:07:23 crc kubenswrapper[4735]: E0223 00:07:23.064626 4735 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.213:6443: connect: connection refused" node="crc"
Feb 23 00:07:23 crc kubenswrapper[4735]: I0223 00:07:23.197123 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 16:36:42.330934038 +0000 UTC
Feb 23 00:07:23 crc kubenswrapper[4735]: I0223 00:07:23.197662 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused
Feb 23 00:07:23 crc kubenswrapper[4735]: I0223 00:07:23.279041 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a837942e7bd497e773deeaa226ef15cc31403971076b983e2bb8edb2832c4e01"}
Feb 23 00:07:23 crc kubenswrapper[4735]: I0223 00:07:23.280287 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"c4f1ca094309814d9fcae918a2e7790903c174e5485bd1505ed95d8f8f7fb462"}
Feb 23 00:07:23 crc kubenswrapper[4735]: I0223 00:07:23.282376 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b375367694310b66dc459d0f84310a24350eec5a8a6f71979735ada80c843593"}
Feb 23 00:07:23 crc kubenswrapper[4735]: I0223 00:07:23.283649 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"004ecc0a405445089ec2aa124099154f449a30bc8dd12a665e4e098d93471a94"}
Feb 23 00:07:23 crc kubenswrapper[4735]: I0223 00:07:23.284938 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"9b1ef4800e028e201f2d1c94957cb0cd389e0b9f952396dcbff97049464c0029"}
Feb 23 00:07:23 crc kubenswrapper[4735]: W0223 00:07:23.397120 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused
Feb 23 00:07:23 crc kubenswrapper[4735]: E0223 00:07:23.397493 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.213:6443: connect: connection refused" logger="UnhandledError"
Feb 
23 00:07:23 crc kubenswrapper[4735]: W0223 00:07:23.523314 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused Feb 23 00:07:23 crc kubenswrapper[4735]: E0223 00:07:23.523414 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.213:6443: connect: connection refused" logger="UnhandledError" Feb 23 00:07:23 crc kubenswrapper[4735]: W0223 00:07:23.566989 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused Feb 23 00:07:23 crc kubenswrapper[4735]: E0223 00:07:23.567148 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.213:6443: connect: connection refused" logger="UnhandledError" Feb 23 00:07:23 crc kubenswrapper[4735]: E0223 00:07:23.613367 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" interval="1.6s" Feb 23 00:07:23 crc kubenswrapper[4735]: W0223 00:07:23.720131 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused Feb 23 00:07:23 crc kubenswrapper[4735]: E0223 00:07:23.720239 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.213:6443: connect: connection refused" logger="UnhandledError" Feb 23 00:07:23 crc kubenswrapper[4735]: I0223 00:07:23.865802 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 00:07:23 crc kubenswrapper[4735]: I0223 00:07:23.867498 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:23 crc kubenswrapper[4735]: I0223 00:07:23.867560 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:23 crc kubenswrapper[4735]: I0223 00:07:23.867578 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:23 crc kubenswrapper[4735]: I0223 00:07:23.867612 4735 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 23 00:07:23 crc kubenswrapper[4735]: E0223 00:07:23.868160 4735 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.213:6443: connect: connection refused" node="crc" Feb 23 00:07:24 crc kubenswrapper[4735]: I0223 00:07:24.108078 4735 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 23 00:07:24 crc kubenswrapper[4735]: E0223 00:07:24.109742 4735 certificate_manager.go:562] "Unhandled Error" 
err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.213:6443: connect: connection refused" logger="UnhandledError" Feb 23 00:07:24 crc kubenswrapper[4735]: I0223 00:07:24.197562 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 23:12:01.874438148 +0000 UTC Feb 23 00:07:24 crc kubenswrapper[4735]: I0223 00:07:24.197815 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused Feb 23 00:07:24 crc kubenswrapper[4735]: I0223 00:07:24.292929 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"dd23f1d94c143d946e301f9af58c6fe189e1e95ef05c2648ad167acab65726df"} Feb 23 00:07:24 crc kubenswrapper[4735]: I0223 00:07:24.293009 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"43f8fa8d2d76248580ea4800c8e0e97c060feb42b7c78fa9be3f5adbeaaf9437"} Feb 23 00:07:24 crc kubenswrapper[4735]: I0223 00:07:24.293055 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a47ab0d8562bafd3080c29289e6cd496a9f76188960a3b6036cc044609876406"} Feb 23 00:07:24 crc kubenswrapper[4735]: I0223 00:07:24.295487 4735 generic.go:334] "Generic (PLEG): 
container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="4fd35aa0556235dde90cf7c6f1750c98352ccdfc64f6d4b9acd35efed020aeb5" exitCode=0 Feb 23 00:07:24 crc kubenswrapper[4735]: I0223 00:07:24.295577 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"4fd35aa0556235dde90cf7c6f1750c98352ccdfc64f6d4b9acd35efed020aeb5"} Feb 23 00:07:24 crc kubenswrapper[4735]: I0223 00:07:24.295614 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 00:07:24 crc kubenswrapper[4735]: I0223 00:07:24.297168 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:24 crc kubenswrapper[4735]: I0223 00:07:24.297234 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:24 crc kubenswrapper[4735]: I0223 00:07:24.297254 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:24 crc kubenswrapper[4735]: I0223 00:07:24.297463 4735 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10" exitCode=0 Feb 23 00:07:24 crc kubenswrapper[4735]: I0223 00:07:24.297525 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10"} Feb 23 00:07:24 crc kubenswrapper[4735]: I0223 00:07:24.297571 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 00:07:24 crc kubenswrapper[4735]: I0223 00:07:24.298341 4735 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:24 crc kubenswrapper[4735]: I0223 00:07:24.298368 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:24 crc kubenswrapper[4735]: I0223 00:07:24.298378 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:24 crc kubenswrapper[4735]: I0223 00:07:24.299812 4735 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="880ab7a6b4768fded1fc53121addf4b591e68d18801561691f449f38565c6e2c" exitCode=0 Feb 23 00:07:24 crc kubenswrapper[4735]: I0223 00:07:24.299891 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 00:07:24 crc kubenswrapper[4735]: I0223 00:07:24.299908 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"880ab7a6b4768fded1fc53121addf4b591e68d18801561691f449f38565c6e2c"} Feb 23 00:07:24 crc kubenswrapper[4735]: I0223 00:07:24.300782 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:24 crc kubenswrapper[4735]: I0223 00:07:24.301332 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:24 crc kubenswrapper[4735]: I0223 00:07:24.301344 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:24 crc kubenswrapper[4735]: I0223 00:07:24.302247 4735 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676" exitCode=0 Feb 23 00:07:24 crc kubenswrapper[4735]: I0223 00:07:24.302279 4735 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676"} Feb 23 00:07:24 crc kubenswrapper[4735]: I0223 00:07:24.302330 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 00:07:24 crc kubenswrapper[4735]: I0223 00:07:24.303378 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:24 crc kubenswrapper[4735]: I0223 00:07:24.303402 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:24 crc kubenswrapper[4735]: I0223 00:07:24.303411 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:24 crc kubenswrapper[4735]: I0223 00:07:24.304538 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 00:07:24 crc kubenswrapper[4735]: I0223 00:07:24.306728 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:24 crc kubenswrapper[4735]: I0223 00:07:24.306783 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:24 crc kubenswrapper[4735]: I0223 00:07:24.306800 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:25 crc kubenswrapper[4735]: W0223 00:07:25.174834 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused Feb 23 00:07:25 crc kubenswrapper[4735]: E0223 00:07:25.175055 4735 reflector.go:158] "Unhandled 
Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.213:6443: connect: connection refused" logger="UnhandledError" Feb 23 00:07:25 crc kubenswrapper[4735]: I0223 00:07:25.197517 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused Feb 23 00:07:25 crc kubenswrapper[4735]: I0223 00:07:25.198690 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 12:44:16.403902789 +0000 UTC Feb 23 00:07:25 crc kubenswrapper[4735]: E0223 00:07:25.216725 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" interval="3.2s" Feb 23 00:07:25 crc kubenswrapper[4735]: I0223 00:07:25.309437 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3"} Feb 23 00:07:25 crc kubenswrapper[4735]: I0223 00:07:25.309524 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765"} Feb 23 00:07:25 crc kubenswrapper[4735]: I0223 00:07:25.309536 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549"} Feb 23 00:07:25 crc kubenswrapper[4735]: I0223 00:07:25.309545 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f"} Feb 23 00:07:25 crc kubenswrapper[4735]: I0223 00:07:25.314136 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"dfb8e52070bd8c7a9784d6f81037eee627297839cf49ec88b861a6d8f3a7a16e"} Feb 23 00:07:25 crc kubenswrapper[4735]: I0223 00:07:25.314237 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 00:07:25 crc kubenswrapper[4735]: I0223 00:07:25.315409 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:25 crc kubenswrapper[4735]: I0223 00:07:25.315450 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:25 crc kubenswrapper[4735]: I0223 00:07:25.315465 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:25 crc kubenswrapper[4735]: I0223 00:07:25.317765 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"6b73257ceeafdfe417dd887f970b9f16ddd188d8ed38699ec89189e451698539"} Feb 23 00:07:25 crc kubenswrapper[4735]: I0223 00:07:25.317935 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"6722b48f73153052807e8d89e0ef3bbada7ac84740e56c10f8767509b55ab5c3"} Feb 23 00:07:25 crc kubenswrapper[4735]: I0223 00:07:25.318070 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"4dd0741d00d65832abe165aed347ca8633cb72d8d3395922dfa5615527f37d55"} Feb 23 00:07:25 crc kubenswrapper[4735]: I0223 00:07:25.317983 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 00:07:25 crc kubenswrapper[4735]: I0223 00:07:25.319220 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:25 crc kubenswrapper[4735]: I0223 00:07:25.319268 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:25 crc kubenswrapper[4735]: I0223 00:07:25.319284 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:25 crc kubenswrapper[4735]: I0223 00:07:25.320002 4735 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97" exitCode=0 Feb 23 00:07:25 crc kubenswrapper[4735]: I0223 00:07:25.320081 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 00:07:25 crc kubenswrapper[4735]: I0223 00:07:25.320104 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97"} Feb 23 00:07:25 crc kubenswrapper[4735]: I0223 00:07:25.321915 4735 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:25 crc kubenswrapper[4735]: I0223 00:07:25.321960 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:25 crc kubenswrapper[4735]: I0223 00:07:25.321979 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:25 crc kubenswrapper[4735]: I0223 00:07:25.324464 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"8717b726bbca97df0f82ec42ac1d3c84d9fa009f79ed5dbd4b4a0bc2ae54c896"} Feb 23 00:07:25 crc kubenswrapper[4735]: I0223 00:07:25.324538 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 00:07:25 crc kubenswrapper[4735]: I0223 00:07:25.326487 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:25 crc kubenswrapper[4735]: I0223 00:07:25.326642 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:25 crc kubenswrapper[4735]: I0223 00:07:25.326768 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:25 crc kubenswrapper[4735]: I0223 00:07:25.468683 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 00:07:25 crc kubenswrapper[4735]: I0223 00:07:25.469790 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:25 crc kubenswrapper[4735]: I0223 00:07:25.469818 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:25 crc kubenswrapper[4735]: I0223 00:07:25.469827 4735 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:25 crc kubenswrapper[4735]: I0223 00:07:25.469862 4735 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 23 00:07:25 crc kubenswrapper[4735]: E0223 00:07:25.470296 4735 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.213:6443: connect: connection refused" node="crc" Feb 23 00:07:25 crc kubenswrapper[4735]: E0223 00:07:25.471625 4735 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.213:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1896b77af7d9c33a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-23 00:07:22.189005626 +0000 UTC m=+0.652551637,LastTimestamp:2026-02-23 00:07:22.189005626 +0000 UTC m=+0.652551637,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 23 00:07:25 crc kubenswrapper[4735]: W0223 00:07:25.504778 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.213:6443: connect: connection refused Feb 23 00:07:25 crc kubenswrapper[4735]: E0223 00:07:25.504884 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get 
\"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.213:6443: connect: connection refused" logger="UnhandledError" Feb 23 00:07:26 crc kubenswrapper[4735]: I0223 00:07:26.199154 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 21:31:00.455037609 +0000 UTC Feb 23 00:07:26 crc kubenswrapper[4735]: I0223 00:07:26.335472 4735 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea" exitCode=0 Feb 23 00:07:26 crc kubenswrapper[4735]: I0223 00:07:26.335590 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea"} Feb 23 00:07:26 crc kubenswrapper[4735]: I0223 00:07:26.335923 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 00:07:26 crc kubenswrapper[4735]: I0223 00:07:26.338166 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:26 crc kubenswrapper[4735]: I0223 00:07:26.338256 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:26 crc kubenswrapper[4735]: I0223 00:07:26.338284 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:26 crc kubenswrapper[4735]: I0223 00:07:26.341834 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 00:07:26 crc kubenswrapper[4735]: I0223 00:07:26.342094 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8"} Feb 23 00:07:26 crc kubenswrapper[4735]: I0223 00:07:26.342232 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 00:07:26 crc kubenswrapper[4735]: I0223 00:07:26.342271 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 00:07:26 crc kubenswrapper[4735]: I0223 00:07:26.343428 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:26 crc kubenswrapper[4735]: I0223 00:07:26.343495 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:26 crc kubenswrapper[4735]: I0223 00:07:26.343521 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:26 crc kubenswrapper[4735]: I0223 00:07:26.343566 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 00:07:26 crc kubenswrapper[4735]: I0223 00:07:26.343963 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:26 crc kubenswrapper[4735]: I0223 00:07:26.344019 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:26 crc kubenswrapper[4735]: I0223 00:07:26.344045 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:26 crc kubenswrapper[4735]: I0223 00:07:26.344014 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 23 00:07:26 crc kubenswrapper[4735]: I0223 00:07:26.344162 4735 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:26 crc kubenswrapper[4735]: I0223 00:07:26.344208 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:26 crc kubenswrapper[4735]: I0223 00:07:26.344227 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:26 crc kubenswrapper[4735]: I0223 00:07:26.345487 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:26 crc kubenswrapper[4735]: I0223 00:07:26.345524 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:26 crc kubenswrapper[4735]: I0223 00:07:26.345542 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:27 crc kubenswrapper[4735]: I0223 00:07:27.199535 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 17:47:25.126653951 +0000 UTC Feb 23 00:07:27 crc kubenswrapper[4735]: I0223 00:07:27.350496 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88"} Feb 23 00:07:27 crc kubenswrapper[4735]: I0223 00:07:27.350571 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962"} Feb 23 00:07:27 crc kubenswrapper[4735]: I0223 00:07:27.350606 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd"} Feb 23 00:07:27 crc kubenswrapper[4735]: I0223 00:07:27.350581 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 00:07:27 crc kubenswrapper[4735]: I0223 00:07:27.350694 4735 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 23 00:07:27 crc kubenswrapper[4735]: I0223 00:07:27.350757 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 00:07:27 crc kubenswrapper[4735]: I0223 00:07:27.352336 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:27 crc kubenswrapper[4735]: I0223 00:07:27.352374 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:27 crc kubenswrapper[4735]: I0223 00:07:27.352385 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:27 crc kubenswrapper[4735]: I0223 00:07:27.353178 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:27 crc kubenswrapper[4735]: I0223 00:07:27.353237 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:27 crc kubenswrapper[4735]: I0223 00:07:27.353256 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:27 crc kubenswrapper[4735]: I0223 00:07:27.559057 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 00:07:27 crc kubenswrapper[4735]: I0223 00:07:27.559273 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Feb 23 00:07:27 crc kubenswrapper[4735]: I0223 00:07:27.561075 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:27 crc kubenswrapper[4735]: I0223 00:07:27.561113 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:27 crc kubenswrapper[4735]: I0223 00:07:27.561122 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:27 crc kubenswrapper[4735]: I0223 00:07:27.616578 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 00:07:28 crc kubenswrapper[4735]: I0223 00:07:28.200627 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 19:29:04.656696167 +0000 UTC Feb 23 00:07:28 crc kubenswrapper[4735]: I0223 00:07:28.359699 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 00:07:28 crc kubenswrapper[4735]: I0223 00:07:28.359938 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca"} Feb 23 00:07:28 crc kubenswrapper[4735]: I0223 00:07:28.360057 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20"} Feb 23 00:07:28 crc kubenswrapper[4735]: I0223 00:07:28.360094 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 00:07:28 crc kubenswrapper[4735]: I0223 
00:07:28.359961 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 00:07:28 crc kubenswrapper[4735]: I0223 00:07:28.361690 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:28 crc kubenswrapper[4735]: I0223 00:07:28.361742 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:28 crc kubenswrapper[4735]: I0223 00:07:28.361700 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:28 crc kubenswrapper[4735]: I0223 00:07:28.361803 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:28 crc kubenswrapper[4735]: I0223 00:07:28.361761 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:28 crc kubenswrapper[4735]: I0223 00:07:28.361817 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:28 crc kubenswrapper[4735]: I0223 00:07:28.396763 4735 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 23 00:07:28 crc kubenswrapper[4735]: I0223 00:07:28.671071 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 00:07:28 crc kubenswrapper[4735]: I0223 00:07:28.673052 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:28 crc kubenswrapper[4735]: I0223 00:07:28.673119 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:28 crc kubenswrapper[4735]: I0223 00:07:28.673137 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 
23 00:07:28 crc kubenswrapper[4735]: I0223 00:07:28.673216 4735 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 23 00:07:29 crc kubenswrapper[4735]: I0223 00:07:29.116699 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 00:07:29 crc kubenswrapper[4735]: I0223 00:07:29.116993 4735 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 23 00:07:29 crc kubenswrapper[4735]: I0223 00:07:29.117050 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 00:07:29 crc kubenswrapper[4735]: I0223 00:07:29.118655 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:29 crc kubenswrapper[4735]: I0223 00:07:29.118734 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:29 crc kubenswrapper[4735]: I0223 00:07:29.118752 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:29 crc kubenswrapper[4735]: I0223 00:07:29.201342 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 11:23:32.65738323 +0000 UTC Feb 23 00:07:29 crc kubenswrapper[4735]: I0223 00:07:29.227577 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 00:07:29 crc kubenswrapper[4735]: I0223 00:07:29.231812 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 00:07:29 crc kubenswrapper[4735]: I0223 00:07:29.362220 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 00:07:29 crc kubenswrapper[4735]: I0223 00:07:29.362297 4735 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 00:07:29 crc kubenswrapper[4735]: I0223 00:07:29.362315 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 00:07:29 crc kubenswrapper[4735]: I0223 00:07:29.363818 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:29 crc kubenswrapper[4735]: I0223 00:07:29.363893 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:29 crc kubenswrapper[4735]: I0223 00:07:29.363911 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:29 crc kubenswrapper[4735]: I0223 00:07:29.364086 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:29 crc kubenswrapper[4735]: I0223 00:07:29.364140 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:29 crc kubenswrapper[4735]: I0223 00:07:29.364154 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:29 crc kubenswrapper[4735]: I0223 00:07:29.364875 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:29 crc kubenswrapper[4735]: I0223 00:07:29.364922 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:29 crc kubenswrapper[4735]: I0223 00:07:29.364935 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:29 crc kubenswrapper[4735]: I0223 00:07:29.722493 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 00:07:30 crc kubenswrapper[4735]: I0223 00:07:30.067557 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Feb 23 00:07:30 crc kubenswrapper[4735]: I0223 00:07:30.135176 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Feb 23 00:07:30 crc kubenswrapper[4735]: I0223 00:07:30.201468 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 11:39:29.69775419 +0000 UTC Feb 23 00:07:30 crc kubenswrapper[4735]: I0223 00:07:30.365875 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 00:07:30 crc kubenswrapper[4735]: I0223 00:07:30.365949 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 00:07:30 crc kubenswrapper[4735]: I0223 00:07:30.365988 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 00:07:30 crc kubenswrapper[4735]: I0223 00:07:30.368185 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:30 crc kubenswrapper[4735]: I0223 00:07:30.368214 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:30 crc kubenswrapper[4735]: I0223 00:07:30.368249 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:30 crc kubenswrapper[4735]: I0223 00:07:30.368204 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:30 crc kubenswrapper[4735]: I0223 00:07:30.368300 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 23 00:07:30 crc kubenswrapper[4735]: I0223 00:07:30.368322 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:30 crc kubenswrapper[4735]: I0223 00:07:30.368269 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:30 crc kubenswrapper[4735]: I0223 00:07:30.368434 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:30 crc kubenswrapper[4735]: I0223 00:07:30.368274 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:31 crc kubenswrapper[4735]: I0223 00:07:31.201665 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 07:42:46.905778318 +0000 UTC Feb 23 00:07:31 crc kubenswrapper[4735]: I0223 00:07:31.367566 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 00:07:31 crc kubenswrapper[4735]: I0223 00:07:31.368489 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:31 crc kubenswrapper[4735]: I0223 00:07:31.368552 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:31 crc kubenswrapper[4735]: I0223 00:07:31.368571 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:32 crc kubenswrapper[4735]: I0223 00:07:32.202242 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 08:53:24.982819786 +0000 UTC Feb 23 00:07:32 crc kubenswrapper[4735]: E0223 00:07:32.364606 4735 eviction_manager.go:285] "Eviction manager: 
failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 23 00:07:33 crc kubenswrapper[4735]: I0223 00:07:33.202583 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 04:07:59.913172764 +0000 UTC Feb 23 00:07:34 crc kubenswrapper[4735]: I0223 00:07:34.187324 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 00:07:34 crc kubenswrapper[4735]: I0223 00:07:34.187498 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 00:07:34 crc kubenswrapper[4735]: I0223 00:07:34.188817 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:34 crc kubenswrapper[4735]: I0223 00:07:34.188844 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:34 crc kubenswrapper[4735]: I0223 00:07:34.188883 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:34 crc kubenswrapper[4735]: I0223 00:07:34.203155 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 04:48:51.43466874 +0000 UTC Feb 23 00:07:34 crc kubenswrapper[4735]: I0223 00:07:34.284606 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 00:07:34 crc kubenswrapper[4735]: I0223 00:07:34.377290 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 00:07:34 crc kubenswrapper[4735]: I0223 00:07:34.378769 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 23 00:07:34 crc kubenswrapper[4735]: I0223 00:07:34.378830 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:34 crc kubenswrapper[4735]: I0223 00:07:34.378879 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:35 crc kubenswrapper[4735]: I0223 00:07:35.204049 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-06 18:37:54.106704359 +0000 UTC Feb 23 00:07:35 crc kubenswrapper[4735]: W0223 00:07:35.812876 4735 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 23 00:07:35 crc kubenswrapper[4735]: I0223 00:07:35.812998 4735 trace.go:236] Trace[51757076]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (23-Feb-2026 00:07:25.811) (total time: 10001ms): Feb 23 00:07:35 crc kubenswrapper[4735]: Trace[51757076]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (00:07:35.812) Feb 23 00:07:35 crc kubenswrapper[4735]: Trace[51757076]: [10.001530444s] [10.001530444s] END Feb 23 00:07:35 crc kubenswrapper[4735]: E0223 00:07:35.813031 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 23 00:07:36 crc kubenswrapper[4735]: W0223 00:07:36.126713 4735 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 23 00:07:36 crc kubenswrapper[4735]: I0223 00:07:36.126843 4735 trace.go:236] Trace[1401864195]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (23-Feb-2026 00:07:26.124) (total time: 10001ms): Feb 23 00:07:36 crc kubenswrapper[4735]: Trace[1401864195]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (00:07:36.126) Feb 23 00:07:36 crc kubenswrapper[4735]: Trace[1401864195]: [10.001886484s] [10.001886484s] END Feb 23 00:07:36 crc kubenswrapper[4735]: E0223 00:07:36.126926 4735 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 23 00:07:36 crc kubenswrapper[4735]: I0223 00:07:36.198358 4735 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Feb 23 00:07:36 crc kubenswrapper[4735]: I0223 00:07:36.204482 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 02:11:09.486207969 +0000 UTC Feb 23 00:07:36 crc kubenswrapper[4735]: I0223 00:07:36.662090 4735 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" 
start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 23 00:07:36 crc kubenswrapper[4735]: I0223 00:07:36.662253 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 23 00:07:36 crc kubenswrapper[4735]: I0223 00:07:36.667549 4735 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 23 00:07:36 crc kubenswrapper[4735]: I0223 00:07:36.667621 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 23 00:07:37 crc kubenswrapper[4735]: I0223 00:07:37.205150 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 13:18:01.54887636 +0000 UTC Feb 23 00:07:37 crc kubenswrapper[4735]: I0223 00:07:37.284597 4735 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 23 00:07:37 crc kubenswrapper[4735]: I0223 
00:07:37.284691 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 23 00:07:38 crc kubenswrapper[4735]: I0223 00:07:38.206207 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 15:12:16.337731 +0000 UTC Feb 23 00:07:39 crc kubenswrapper[4735]: I0223 00:07:39.125253 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 00:07:39 crc kubenswrapper[4735]: I0223 00:07:39.125463 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 00:07:39 crc kubenswrapper[4735]: I0223 00:07:39.127223 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:39 crc kubenswrapper[4735]: I0223 00:07:39.127272 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:39 crc kubenswrapper[4735]: I0223 00:07:39.127290 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:39 crc kubenswrapper[4735]: I0223 00:07:39.132267 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 00:07:39 crc kubenswrapper[4735]: I0223 00:07:39.206767 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 17:13:07.067371622 +0000 UTC Feb 23 00:07:39 crc kubenswrapper[4735]: I0223 
00:07:39.390463 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 00:07:39 crc kubenswrapper[4735]: I0223 00:07:39.391885 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:39 crc kubenswrapper[4735]: I0223 00:07:39.391937 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:39 crc kubenswrapper[4735]: I0223 00:07:39.391954 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:39 crc kubenswrapper[4735]: I0223 00:07:39.476061 4735 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 23 00:07:40 crc kubenswrapper[4735]: I0223 00:07:40.108545 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Feb 23 00:07:40 crc kubenswrapper[4735]: I0223 00:07:40.129717 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Feb 23 00:07:40 crc kubenswrapper[4735]: I0223 00:07:40.185964 4735 apiserver.go:52] "Watching apiserver" Feb 23 00:07:40 crc kubenswrapper[4735]: I0223 00:07:40.192389 4735 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 23 00:07:40 crc kubenswrapper[4735]: I0223 00:07:40.193737 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-etcd/etcd-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"] Feb 23 00:07:40 crc kubenswrapper[4735]: I0223 00:07:40.195346 4735 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 23 00:07:40 crc kubenswrapper[4735]: I0223 00:07:40.195780 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:07:40 crc kubenswrapper[4735]: I0223 00:07:40.195943 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 23 00:07:40 crc kubenswrapper[4735]: E0223 00:07:40.196064 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 00:07:40 crc kubenswrapper[4735]: I0223 00:07:40.196148 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:07:40 crc kubenswrapper[4735]: I0223 00:07:40.196190 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 23 00:07:40 crc kubenswrapper[4735]: I0223 00:07:40.196225 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:07:40 crc kubenswrapper[4735]: E0223 00:07:40.196272 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 00:07:40 crc kubenswrapper[4735]: E0223 00:07:40.197049 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 00:07:40 crc kubenswrapper[4735]: I0223 00:07:40.198395 4735 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 23 00:07:40 crc kubenswrapper[4735]: I0223 00:07:40.199095 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 23 00:07:40 crc kubenswrapper[4735]: I0223 00:07:40.199140 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 23 00:07:40 crc kubenswrapper[4735]: I0223 00:07:40.199521 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 23 00:07:40 crc kubenswrapper[4735]: I0223 00:07:40.199586 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 23 00:07:40 crc kubenswrapper[4735]: I0223 00:07:40.200393 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 23 00:07:40 crc kubenswrapper[4735]: I0223 00:07:40.200478 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 23 00:07:40 crc kubenswrapper[4735]: I0223 00:07:40.200920 4735 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-operator"/"iptables-alerter-script" Feb 23 00:07:40 crc kubenswrapper[4735]: I0223 00:07:40.203557 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 23 00:07:40 crc kubenswrapper[4735]: I0223 00:07:40.204127 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 23 00:07:40 crc kubenswrapper[4735]: I0223 00:07:40.207337 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 16:11:51.809085732 +0000 UTC Feb 23 00:07:40 crc kubenswrapper[4735]: I0223 00:07:40.257742 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20a3f0f8-dc0e-4c2c-95ac-2ef9874425f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\
\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\"
:\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"
},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 00:07:40 crc kubenswrapper[4735]: I0223 00:07:40.283220 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 00:07:40 crc kubenswrapper[4735]: I0223 00:07:40.299139 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 00:07:40 crc kubenswrapper[4735]: I0223 00:07:40.315904 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 00:07:40 crc kubenswrapper[4735]: I0223 00:07:40.333544 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 00:07:40 crc kubenswrapper[4735]: I0223 00:07:40.347480 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 00:07:40 crc kubenswrapper[4735]: I0223 00:07:40.360471 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 00:07:40 crc kubenswrapper[4735]: I0223 00:07:40.374273 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 23 00:07:40 crc kubenswrapper[4735]: I0223 00:07:40.389486 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 00:07:40 crc kubenswrapper[4735]: E0223 00:07:40.399880 4735 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Feb 23 00:07:40 crc kubenswrapper[4735]: I0223 00:07:40.403755 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 00:07:40 crc kubenswrapper[4735]: I0223 00:07:40.432600 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20a3f0f8-dc0e-4c2c-95ac-2ef9874425f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.187435 4735 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.207466 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 10:42:29.503352905 +0000 UTC Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.271768 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:07:41 crc kubenswrapper[4735]: E0223 00:07:41.272014 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 00:07:41 crc kubenswrapper[4735]: E0223 00:07:41.657832 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.659700 4735 trace.go:236] Trace[1717376175]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (23-Feb-2026 00:07:30.185) (total time: 11473ms): Feb 23 00:07:41 crc kubenswrapper[4735]: Trace[1717376175]: ---"Objects listed" error: 11473ms (00:07:41.659) Feb 23 00:07:41 crc kubenswrapper[4735]: Trace[1717376175]: [11.473856465s] [11.473856465s] END Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.659974 4735 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.660827 4735 trace.go:236] Trace[1928967736]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (23-Feb-2026 00:07:29.645) (total time: 12015ms): Feb 23 00:07:41 crc kubenswrapper[4735]: Trace[1928967736]: ---"Objects listed" error: 12015ms (00:07:41.660) Feb 23 00:07:41 crc kubenswrapper[4735]: Trace[1928967736]: [12.01542728s] [12.01542728s] END Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.660910 4735 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.662280 4735 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 23 00:07:41 crc kubenswrapper[4735]: E0223 00:07:41.663351 4735 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" 
node="crc" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.674768 4735 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.712440 4735 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:57512->192.168.126.11:17697: read: connection reset by peer" start-of-body= Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.712524 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:57512->192.168.126.11:17697: read: connection reset by peer" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.713037 4735 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.713082 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.713436 4735 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure 
output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.713507 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.763019 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.763088 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.763124 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.763160 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 00:07:41 crc 
kubenswrapper[4735]: I0223 00:07:41.763194 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.763227 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.763298 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.763356 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.763389 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.763423 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod 
\"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.763491 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.763703 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.763949 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.763990 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.764027 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.764060 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.764095 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.764140 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.764190 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.764266 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.764311 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.764353 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.764385 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.764422 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.764412 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.764501 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.764545 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.764579 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.764615 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.764649 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.764685 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.764731 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.764768 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.764809 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.764841 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 23 00:07:41 crc kubenswrapper[4735]: 
I0223 00:07:41.764840 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.764906 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.764939 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.764969 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.764999 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.765032 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.765154 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.765502 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.765550 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.765489 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.765648 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.765747 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.766185 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.766217 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.766330 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.766512 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.766569 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.766628 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.766688 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.766845 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.766999 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.767354 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.767825 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.767889 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.767946 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.768255 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.768297 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.768333 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.768365 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.768400 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.768433 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.768466 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.768527 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.768562 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.768603 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.768635 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: 
\"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.768667 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.768702 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.768736 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.768767 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.768799 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.768829 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.768898 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.769011 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.769087 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.769150 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.769204 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: 
\"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.769271 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.769321 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.769380 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.769615 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.769659 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.769791 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.769832 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.769880 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.769939 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.769949 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.769997 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.770051 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.770108 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.770164 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.770213 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.770263 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.770319 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.770365 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.770417 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.770468 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.770515 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: 
\"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.770564 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.770614 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.770659 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.770709 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.770759 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.770810 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.770897 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.770955 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.771012 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.771066 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.771119 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 23 00:07:41 crc 
kubenswrapper[4735]: I0223 00:07:41.771168 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.771908 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.771969 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.772022 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.772077 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.772132 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.772187 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.772246 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.772310 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.772366 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.772424 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.772491 4735 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.772545 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.772597 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.772654 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.772707 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.772762 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: 
\"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.772816 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.772923 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.772982 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.773039 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.773098 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.773152 4735 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.773283 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.773341 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.773398 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.773458 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.769987 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod 
"925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.770078 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.770277 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.770480 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.770668 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.770758 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.770820 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.772226 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.772195 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.772438 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.772443 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.772590 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.772681 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.772706 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.772897 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.773469 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.773721 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.774269 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.774530 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.774796 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.775556 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.775698 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.776459 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.777066 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.773514 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.778137 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.778186 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.778235 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.778276 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.778316 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.778351 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.778403 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.778453 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.778512 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.778551 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: 
\"43509403-f426-496e-be36-56cef71462f5\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.778587 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.778621 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.778679 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.778712 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.778746 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.778777 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.778811 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.778880 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.778918 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.778976 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.779047 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 
00:07:41.779103 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.779153 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.779203 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.779253 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.779307 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.779358 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") 
pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.779407 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.779501 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.779556 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.779604 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.779653 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.779701 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.779751 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.779805 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.779901 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.779956 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.780430 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.780541 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.780600 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.780650 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.780703 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.780761 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.780798 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" 
(UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.780831 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.780903 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.780954 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.781000 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.781037 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.781071 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.781107 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.781156 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.781207 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.781240 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.781275 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.781311 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.781346 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.781380 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.781415 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.781448 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.781480 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.781604 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.781640 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.781677 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.781712 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.781745 
4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.781783 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.781819 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.781890 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.781942 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.781995 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.782037 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.782080 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.782116 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.782155 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.782189 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 23 00:07:41 crc 
kubenswrapper[4735]: I0223 00:07:41.782224 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.782257 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.782288 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.782321 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.782393 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.782433 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.782472 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.784080 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.778025 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.778393 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.778528 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.788146 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.779296 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.779734 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.779802 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.780270 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.780450 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.780506 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.780716 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.780992 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.782059 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.782533 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.782723 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.782981 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.783015 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.783068 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: E0223 00:07:41.783166 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:07:42.283138975 +0000 UTC m=+20.746684986 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.788573 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.788664 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.788783 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.788481 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.789038 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.789043 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.789086 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.789125 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.789165 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.789206 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.789240 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.789276 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.789316 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.789350 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.789382 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.789443 4735 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.789471 4735 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.789537 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.789571 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.789636 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.789664 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.789690 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.789713 4735 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.789734 4735 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.789754 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.789774 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.789794 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: 
I0223 00:07:41.789814 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.789835 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.789874 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.789895 4735 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.789919 4735 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.789938 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.790007 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc 
kubenswrapper[4735]: I0223 00:07:41.790030 4735 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.790050 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.790071 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.790092 4735 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.790111 4735 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.790134 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.790155 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.790174 4735 reconciler_common.go:293] 
"Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.790195 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.790214 4735 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.790386 4735 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.790453 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.790476 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.790496 4735 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.790515 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node 
\"crc\" DevicePath \"\""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.790535 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.790554 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.790574 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.790601 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.790621 4735 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.790643 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.790669 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.790688 4735 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.793393 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.793415 4735 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.793436 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.793456 4735 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.793477 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.793496 4735 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.793534 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.793555 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.793581 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.793602 4735 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.793623 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.794155 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.794180 4735 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.794200 4735 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.789937 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.790598 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.783768 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.783782 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.784944 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.784981 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: E0223 00:07:41.785431 4735 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Feb 23 00:07:41 crc kubenswrapper[4735]: E0223 00:07:41.794392 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 00:07:42.294365673 +0000 UTC m=+20.757911684 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.794430 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: E0223 00:07:41.794515 4735 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 23 00:07:41 crc kubenswrapper[4735]: E0223 00:07:41.794570 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 00:07:42.294554939 +0000 UTC m=+20.758100950 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.795317 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.795660 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.795828 4735 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.796210 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.783549 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.786112 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.786145 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.786383 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.786447 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.786436 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.787051 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.787086 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.787843 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.787944 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.788347 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.790636 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.790925 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.791484 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.791520 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.791594 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.792014 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.792123 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.792305 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.792535 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.792678 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.792911 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.793309 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.793530 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.793937 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.796676 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.796895 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.796912 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.797336 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.797411 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.797611 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.797686 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.797721 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.797744 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.797754 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.796316 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.798221 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.798447 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.798624 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.799045 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.799306 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.799352 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.799609 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.800959 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.802305 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.785725 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.807655 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.807979 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.808569 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.808821 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.808878 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.823296 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.824053 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.824270 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.824317 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.824793 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.824790 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.825109 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.825318 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.825617 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.825784 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 00:07:41 crc kubenswrapper[4735]: E0223 00:07:41.826234 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.826259 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls".
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: E0223 00:07:41.826286 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 00:07:41 crc kubenswrapper[4735]: E0223 00:07:41.826346 4735 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 00:07:41 crc kubenswrapper[4735]: E0223 00:07:41.826379 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 00:07:41 crc kubenswrapper[4735]: E0223 00:07:41.826397 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 00:07:41 crc kubenswrapper[4735]: E0223 00:07:41.826417 4735 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.826443 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: E0223 00:07:41.826497 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-23 00:07:42.326479977 +0000 UTC m=+20.790025958 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.826503 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.826529 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: E0223 00:07:41.826543 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-23 00:07:42.326510988 +0000 UTC m=+20.790056969 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.826954 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.826998 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.827195 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.827235 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.827264 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.827744 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.828460 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.829028 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.829128 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.829137 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.830240 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.830299 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.830873 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.831415 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.831428 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.831822 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.832322 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.832486 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.832663 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.832778 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.832808 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.833334 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.833705 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.834596 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.835676 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.835621 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.835649 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.835740 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.836214 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.841702 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.841944 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.842099 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.842431 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.842740 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.843053 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.843074 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.843227 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.843296 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.843782 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: 
"kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.843826 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.843952 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.843994 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.844003 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.844098 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.845013 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.845096 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.846300 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.860329 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.867388 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.873732 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.880164 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.895020 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.895157 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.895261 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.895294 4735 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.895315 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.895341 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " 
pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.895366 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.895439 4735 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.895453 4735 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.895446 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.895467 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.895537 4735 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.895556 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on 
node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.895574 4735 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.895588 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.895602 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.895616 4735 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.895630 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.895644 4735 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.895659 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc 
kubenswrapper[4735]: I0223 00:07:41.895673 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.895688 4735 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.895702 4735 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.895717 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.895731 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.895745 4735 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.895757 4735 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.895770 4735 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.895784 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.895796 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.895809 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.895822 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.895838 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.895920 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.895934 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.895948 4735 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.895962 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.895976 4735 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.895994 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896008 4735 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896024 4735 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896041 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath 
\"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896054 4735 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896068 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896082 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896096 4735 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896114 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896128 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896142 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896156 4735 
reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896177 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896191 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896205 4735 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896219 4735 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896234 4735 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896247 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896259 4735 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896274 4735 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896292 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896306 4735 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896320 4735 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896333 4735 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896349 4735 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896363 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on 
node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896376 4735 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896389 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896402 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896415 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896429 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896443 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896456 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896470 4735 
reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896484 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896497 4735 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896513 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896526 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896540 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896554 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896567 4735 
reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896581 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896595 4735 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896609 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896623 4735 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896638 4735 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896652 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896667 4735 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896682 4735 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896695 4735 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896708 4735 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896722 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896735 4735 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896749 4735 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896762 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: 
\"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896776 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896791 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896804 4735 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896817 4735 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896829 4735 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896842 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896893 4735 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896905 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896918 4735 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896929 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896941 4735 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896956 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.896970 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.897060 4735 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" 
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.897088 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.897105 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.897117 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.897128 4735 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.897142 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.897155 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.897168 4735 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.897181 4735 reconciler_common.go:293] "Volume detached 
for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.897195 4735 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.897209 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.897221 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.897235 4735 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.897248 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.897260 4735 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.897272 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.897285 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.897299 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.897312 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.897325 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.897338 4735 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.897351 4735 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.897364 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath 
\"\""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.897377 4735 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.897390 4735 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.897404 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.897420 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.897433 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.897446 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.897459 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.897473 4735 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.897485 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.897496 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.897510 4735 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.897522 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.897535 4735 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.897547 4735 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\""
Feb 23 00:07:41 crc kubenswrapper[4735]: I0223 00:07:41.897560 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.019962 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.034037 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.046762 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 23 00:07:42 crc kubenswrapper[4735]: W0223 00:07:42.057463 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-5f69af2b0746e3436eb30a2f0fbb66ff1cf7b140e1f7ed9acf4b00ecb0f1aee6 WatchSource:0}: Error finding container 5f69af2b0746e3436eb30a2f0fbb66ff1cf7b140e1f7ed9acf4b00ecb0f1aee6: Status 404 returned error can't find the container with id 5f69af2b0746e3436eb30a2f0fbb66ff1cf7b140e1f7ed9acf4b00ecb0f1aee6
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.208227 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 01:45:46.784707402 +0000 UTC
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.272000 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.272078 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 00:07:42 crc kubenswrapper[4735]: E0223 00:07:42.272131 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 23 00:07:42 crc kubenswrapper[4735]: E0223 00:07:42.272291 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.280628 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes"
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.281488 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes"
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.283323 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes"
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.284484 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes"
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.285437 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes"
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.286773 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes"
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.287753 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes"
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.288683 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes"
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.290445 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes"
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.292404 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes"
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.293730 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes"
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.296162 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes"
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.297450 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes"
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.297624 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20a3f0f8-dc0e-4c2c-95ac-2ef9874425f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.299779 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes"
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.300278 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.300406 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.300469 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 00:07:42 crc kubenswrapper[4735]: E0223 00:07:42.300638 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:07:43.30058223 +0000 UTC m=+21.764128211 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 00:07:42 crc kubenswrapper[4735]: E0223 00:07:42.300762 4735 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Feb 23 00:07:42 crc kubenswrapper[4735]: E0223 00:07:42.300845 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 00:07:43.300832437 +0000 UTC m=+21.764378418 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.300990 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes"
Feb 23 00:07:42 crc kubenswrapper[4735]: E0223 00:07:42.302282 4735 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 23 00:07:42 crc kubenswrapper[4735]: E0223 00:07:42.302385 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 00:07:43.302359866 +0000 UTC m=+21.765905847 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.302635 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes"
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.303698 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes"
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.304243 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes"
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.305524 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes"
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.306412 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes"
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.307611 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes"
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.308387 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes"
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.309211 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes"
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.310972 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes"
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.311782 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes"
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.312654 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.313204 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes"
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.314812 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes"
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.316043 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes"
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.316886 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes"
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.318628 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes"
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.319565 4735 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6"
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.319727 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes"
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.321876 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes"
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.323136 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes"
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.323840 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.325124 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes"
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.328786 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes"
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.329884 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes"
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.331455 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes"
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.333209 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes"
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.335185 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes"
Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.335904 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.336793 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.337721 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.339550 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.341643 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.342709 4735 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.344734 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.345497 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.346653 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.347190 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.347736 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 23 00:07:42 crc 
kubenswrapper[4735]: I0223 00:07:42.348747 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.349298 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.349826 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.351004 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.351609 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.361744 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.377378 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.400261 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.401214 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.401307 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:07:42 crc kubenswrapper[4735]: E0223 00:07:42.401423 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 00:07:42 crc kubenswrapper[4735]: E0223 00:07:42.401444 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 00:07:42 crc kubenswrapper[4735]: E0223 00:07:42.401457 4735 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 00:07:42 crc kubenswrapper[4735]: E0223 00:07:42.401506 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-23 00:07:43.401492549 +0000 UTC m=+21.865038520 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 00:07:42 crc kubenswrapper[4735]: E0223 00:07:42.401502 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 00:07:42 crc kubenswrapper[4735]: E0223 00:07:42.401544 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 00:07:42 crc kubenswrapper[4735]: E0223 00:07:42.401566 4735 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 00:07:42 crc kubenswrapper[4735]: E0223 00:07:42.401662 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-23 00:07:43.401631572 +0000 UTC m=+21.865177583 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.402937 4735 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8" exitCode=255 Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.403001 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8"} Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.405303 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"38554e989f7f1b7d5a1301d03e8acc8619f6b0dcd8f8763b5fb15ee9665e5816"} Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.408591 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"a35acbfb5836e5ddac27f7e140e64897f9f2ad3615e4db66b091b79744c699d6"} Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.408695 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2c4f2f9d3265d84be4a241b5a2a0193896ad4e9daeec1aaca6fe4fef0be21c88"} Feb 23 00:07:42 crc 
kubenswrapper[4735]: I0223 00:07:42.408864 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"5f69af2b0746e3436eb30a2f0fbb66ff1cf7b140e1f7ed9acf4b00ecb0f1aee6"} Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.411420 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"340ce9e1f14e0dc6339a25b1bd23f58d9790656cfea764bac921d60603920cb2"} Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.411538 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"2cecc3d3aa2d12f0254caff9cd7ce86b802e210154f0a2282b7ba667b50ce9df"} Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.416082 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.416600 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.417105 4735 scope.go:117] "RemoveContainer" containerID="0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8" Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.426727 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.438836 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.453287 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.471264 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20a3f0f8-dc0e-4c2c-95ac-2ef9874425f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.486465 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.499470 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.511960 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.526961 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.543965 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a35acbfb5836e5ddac27f7e140e64897f9f2ad3615e4db66b091b79744c699d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c4f2f9d3265d84be4a241b5a2a0193896ad4e9daeec1aaca6fe4fef0be21c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.560209 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.581800 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40cc14f8-4b0d-4268-82bc-e9c2d8073cf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T00:07:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0223 00:07:35.797171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 00:07:35.799032 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411013438/tls.crt::/tmp/serving-cert-1411013438/tls.key\\\\\\\"\\\\nI0223 00:07:41.683988 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0223 00:07:41.688343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0223 00:07:41.688375 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0223 00:07:41.688410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0223 00:07:41.688422 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0223 00:07:41.698776 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0223 00:07:41.698820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698840 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0223 00:07:41.698875 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0223 00:07:41.698881 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0223 00:07:41.698924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0223 00:07:41.699038 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0223 00:07:41.701822 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.603748 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20a3f0f8-dc0e-4c2c-95ac-2ef9874425f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.616091 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://340ce9e1f14e0dc6339a25b1bd23f58d9790656cfea764bac921d60603920cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-0
2-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 00:07:42 crc kubenswrapper[4735]: I0223 00:07:42.627029 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 23 00:07:43 crc kubenswrapper[4735]: I0223 00:07:43.208963 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 19:21:17.658450703 +0000 UTC Feb 23 00:07:43 crc kubenswrapper[4735]: I0223 00:07:43.271908 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:07:43 crc kubenswrapper[4735]: E0223 00:07:43.272090 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 00:07:43 crc kubenswrapper[4735]: I0223 00:07:43.308403 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 00:07:43 crc kubenswrapper[4735]: I0223 00:07:43.308509 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:07:43 crc kubenswrapper[4735]: I0223 00:07:43.308559 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:07:43 crc kubenswrapper[4735]: E0223 00:07:43.308669 4735 configmap.go:193] Couldn't get configMap 
openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 00:07:43 crc kubenswrapper[4735]: E0223 00:07:43.308742 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:07:45.308708613 +0000 UTC m=+23.772254584 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:07:43 crc kubenswrapper[4735]: E0223 00:07:43.308796 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 00:07:45.308786715 +0000 UTC m=+23.772332686 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 00:07:43 crc kubenswrapper[4735]: E0223 00:07:43.308941 4735 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 00:07:43 crc kubenswrapper[4735]: E0223 00:07:43.308994 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 00:07:45.3089861 +0000 UTC m=+23.772532231 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 00:07:43 crc kubenswrapper[4735]: I0223 00:07:43.409113 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:07:43 crc kubenswrapper[4735]: I0223 00:07:43.409441 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:07:43 crc kubenswrapper[4735]: E0223 00:07:43.409318 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 00:07:43 crc kubenswrapper[4735]: E0223 00:07:43.409706 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 00:07:43 crc kubenswrapper[4735]: E0223 00:07:43.409783 4735 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 00:07:43 crc kubenswrapper[4735]: E0223 00:07:43.409950 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-23 00:07:45.409910919 +0000 UTC m=+23.873456890 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 00:07:43 crc kubenswrapper[4735]: E0223 00:07:43.409563 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 00:07:43 crc kubenswrapper[4735]: E0223 00:07:43.410114 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 00:07:43 crc kubenswrapper[4735]: E0223 00:07:43.410191 4735 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 00:07:43 crc kubenswrapper[4735]: E0223 00:07:43.410302 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-23 00:07:45.41029019 +0000 UTC m=+23.873836171 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 00:07:43 crc kubenswrapper[4735]: I0223 00:07:43.418383 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 23 00:07:43 crc kubenswrapper[4735]: I0223 00:07:43.420760 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777"} Feb 23 00:07:43 crc kubenswrapper[4735]: I0223 00:07:43.421118 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 00:07:43 crc kubenswrapper[4735]: I0223 00:07:43.435541 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:43Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:43 crc kubenswrapper[4735]: I0223 00:07:43.455375 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a35acbfb5836e5ddac27f7e140e64897f9f2ad3615e4db66b091b79744c699d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c4f2f9d3265d84be4a241b5a2a0193896ad4e9daeec1aaca6fe4fef0be21c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:43Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:43 crc kubenswrapper[4735]: I0223 00:07:43.475250 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:43Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:43 crc kubenswrapper[4735]: I0223 00:07:43.501508 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40cc14f8-4b0d-4268-82bc-e9c2d8073cf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T00:07:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0223 00:07:35.797171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 00:07:35.799032 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411013438/tls.crt::/tmp/serving-cert-1411013438/tls.key\\\\\\\"\\\\nI0223 00:07:41.683988 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0223 00:07:41.688343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0223 00:07:41.688375 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0223 00:07:41.688410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0223 00:07:41.688422 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0223 00:07:41.698776 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0223 00:07:41.698820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698840 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0223 00:07:41.698875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0223 00:07:41.698881 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0223 00:07:41.698924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0223 00:07:41.699038 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0223 00:07:41.701822 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:43Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:43 crc kubenswrapper[4735]: I0223 00:07:43.526202 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:43Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:43 crc kubenswrapper[4735]: I0223 00:07:43.549720 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://340ce9e1f14e0dc6339a25b1bd23f58d9790656cfea764bac921d60603920cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:43Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:43 crc kubenswrapper[4735]: I0223 00:07:43.567997 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:43Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:43 crc kubenswrapper[4735]: I0223 00:07:43.593401 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20a3f0f8-dc0e-4c2c-95ac-2ef9874425f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:43Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:44 crc kubenswrapper[4735]: I0223 00:07:44.210043 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 17:11:35.55241239 +0000 UTC Feb 23 00:07:44 crc kubenswrapper[4735]: I0223 00:07:44.271359 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:07:44 crc kubenswrapper[4735]: I0223 00:07:44.271527 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:07:44 crc kubenswrapper[4735]: E0223 00:07:44.271753 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 00:07:44 crc kubenswrapper[4735]: E0223 00:07:44.271932 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 00:07:44 crc kubenswrapper[4735]: I0223 00:07:44.290457 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 00:07:44 crc kubenswrapper[4735]: I0223 00:07:44.297877 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 00:07:44 crc kubenswrapper[4735]: I0223 00:07:44.301138 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 23 00:07:44 crc kubenswrapper[4735]: I0223 00:07:44.309187 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a35acbfb5836e5ddac27f7e140e64897f9f2ad3615e4db66b091b79744c699d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c4f2f9d3265d84be4a241b5a2a0193896ad4e9daeec1aaca6fe4fef0be21c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:44Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:44 crc kubenswrapper[4735]: I0223 00:07:44.330321 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:44Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:44 crc kubenswrapper[4735]: I0223 00:07:44.352157 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40cc14f8-4b0d-4268-82bc-e9c2d8073cf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T00:07:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0223 00:07:35.797171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 00:07:35.799032 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411013438/tls.crt::/tmp/serving-cert-1411013438/tls.key\\\\\\\"\\\\nI0223 00:07:41.683988 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0223 00:07:41.688343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0223 00:07:41.688375 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0223 00:07:41.688410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0223 00:07:41.688422 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0223 00:07:41.698776 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0223 00:07:41.698820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698840 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0223 00:07:41.698875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0223 00:07:41.698881 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0223 00:07:41.698924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0223 00:07:41.699038 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0223 00:07:41.701822 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:44Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:44 crc kubenswrapper[4735]: I0223 00:07:44.373958 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:44Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:44 crc kubenswrapper[4735]: I0223 00:07:44.387969 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:44Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:44 crc kubenswrapper[4735]: I0223 00:07:44.410953 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://340ce9e1f14e0dc6339a25b1bd23f58d9790656cfea764bac921d60603920cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:44Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:44 crc kubenswrapper[4735]: I0223 00:07:44.438365 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:44Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:44 crc kubenswrapper[4735]: I0223 00:07:44.476333 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20a3f0f8-dc0e-4c2c-95ac-2ef9874425f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:44Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:44 crc kubenswrapper[4735]: I0223 00:07:44.503883 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20a3f0f8-dc0e-4c2c-95ac-2ef9874425f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\
":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\
\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68
77441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:44Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:44 crc kubenswrapper[4735]: I0223 00:07:44.523247 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1153ca78-e3e2-458f-8a31-e16738c1677c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43f8fa8d2d76248580ea4800c8e0e97c060feb42b7c78fa9be3f5adbeaaf9437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a47ab0d8562bafd3080c29289e6cd496a9f76188960a3b6036cc044609876406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd23f1d94c143d946e301f9af58c6fe189e1e95ef05c2648ad167acab65726df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb8e52070bd8c7a9784d6f81037eee627297839cf49ec88b861a6d8f3a7a16e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:44Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:44 crc kubenswrapper[4735]: I0223 00:07:44.549975 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://340ce9e1f14e0dc6339a25b1bd23f58d9790656cfea764bac921d60603920cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:44Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:44 crc kubenswrapper[4735]: I0223 00:07:44.578134 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:44Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:44 crc kubenswrapper[4735]: I0223 00:07:44.597903 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:44Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:44 crc kubenswrapper[4735]: I0223 00:07:44.620374 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:44Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:44 crc kubenswrapper[4735]: I0223 00:07:44.639537 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a35acbfb5836e5ddac27f7e140e64897f9f2ad3615e4db66b091b79744c699d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c4f2f9d3265d84be4a241b5a2a0193896ad4e9daeec1aaca6fe4fef0be21c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:44Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:44 crc kubenswrapper[4735]: I0223 00:07:44.659049 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:44Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:44 crc kubenswrapper[4735]: I0223 00:07:44.680255 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40cc14f8-4b0d-4268-82bc-e9c2d8073cf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T00:07:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0223 00:07:35.797171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 00:07:35.799032 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411013438/tls.crt::/tmp/serving-cert-1411013438/tls.key\\\\\\\"\\\\nI0223 00:07:41.683988 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0223 00:07:41.688343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0223 00:07:41.688375 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0223 00:07:41.688410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0223 00:07:41.688422 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0223 00:07:41.698776 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0223 00:07:41.698820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698840 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0223 00:07:41.698875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0223 00:07:41.698881 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0223 00:07:41.698924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0223 00:07:41.699038 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0223 00:07:41.701822 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:44Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:45 crc kubenswrapper[4735]: I0223 00:07:45.210821 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 01:39:57.737503514 +0000 UTC Feb 23 00:07:45 crc kubenswrapper[4735]: I0223 00:07:45.271926 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:07:45 crc kubenswrapper[4735]: E0223 00:07:45.272093 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 00:07:45 crc kubenswrapper[4735]: I0223 00:07:45.329579 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 00:07:45 crc kubenswrapper[4735]: I0223 00:07:45.329697 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:07:45 crc kubenswrapper[4735]: I0223 00:07:45.329743 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:07:45 crc kubenswrapper[4735]: E0223 00:07:45.329831 4735 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 00:07:45 crc kubenswrapper[4735]: E0223 00:07:45.329840 4735 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 00:07:45 crc kubenswrapper[4735]: E0223 00:07:45.329905 4735 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 00:07:49.329892036 +0000 UTC m=+27.793438007 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 00:07:45 crc kubenswrapper[4735]: E0223 00:07:45.329928 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 00:07:49.329911677 +0000 UTC m=+27.793457678 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 00:07:45 crc kubenswrapper[4735]: E0223 00:07:45.330000 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:07:49.329951728 +0000 UTC m=+27.793497749 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:07:45 crc kubenswrapper[4735]: I0223 00:07:45.430511 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:07:45 crc kubenswrapper[4735]: I0223 00:07:45.430952 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:07:45 crc kubenswrapper[4735]: E0223 00:07:45.430780 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 00:07:45 crc kubenswrapper[4735]: E0223 00:07:45.431196 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 00:07:45 crc kubenswrapper[4735]: E0223 00:07:45.431323 4735 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 00:07:45 crc kubenswrapper[4735]: E0223 00:07:45.431094 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 00:07:45 crc kubenswrapper[4735]: E0223 00:07:45.431463 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 00:07:45 crc kubenswrapper[4735]: E0223 00:07:45.431490 4735 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 00:07:45 crc kubenswrapper[4735]: E0223 00:07:45.431468 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-23 00:07:49.43139022 +0000 UTC m=+27.894936271 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 00:07:45 crc kubenswrapper[4735]: E0223 00:07:45.431569 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-23 00:07:49.431550984 +0000 UTC m=+27.895096965 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 00:07:46 crc kubenswrapper[4735]: I0223 00:07:46.211939 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 00:23:33.022229429 +0000 UTC Feb 23 00:07:46 crc kubenswrapper[4735]: I0223 00:07:46.271913 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:07:46 crc kubenswrapper[4735]: E0223 00:07:46.272141 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 00:07:46 crc kubenswrapper[4735]: I0223 00:07:46.272438 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:07:46 crc kubenswrapper[4735]: E0223 00:07:46.272925 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 00:07:46 crc kubenswrapper[4735]: I0223 00:07:46.432717 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"04edd9324344faf24b9181705097f238050d69e14bf46da2f76f3d114552c942"} Feb 23 00:07:46 crc kubenswrapper[4735]: I0223 00:07:46.458821 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:46Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:46 crc kubenswrapper[4735]: I0223 00:07:46.482753 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40cc14f8-4b0d-4268-82bc-e9c2d8073cf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T00:07:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0223 00:07:35.797171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 00:07:35.799032 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411013438/tls.crt::/tmp/serving-cert-1411013438/tls.key\\\\\\\"\\\\nI0223 00:07:41.683988 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0223 00:07:41.688343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0223 00:07:41.688375 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0223 00:07:41.688410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0223 00:07:41.688422 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0223 00:07:41.698776 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0223 00:07:41.698820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698840 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0223 00:07:41.698875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0223 00:07:41.698881 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0223 00:07:41.698924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0223 00:07:41.699038 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0223 00:07:41.701822 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:46Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:46 crc kubenswrapper[4735]: I0223 00:07:46.504680 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:46Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:46 crc kubenswrapper[4735]: I0223 00:07:46.525427 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:46Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:46 crc kubenswrapper[4735]: I0223 00:07:46.547800 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a35acbfb5836e5ddac27f7e140e64897f9f2ad3615e4db66b091b79744c699d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c4f2f9d3265d84be4a241b5a2a0193896ad4e9daeec1aaca6fe4fef0be21c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:46Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:46 crc kubenswrapper[4735]: I0223 00:07:46.569464 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://340ce9e1f14e0dc6339a25b1bd23f58d9790656cfea764bac921d60603920cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:46Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:46 crc kubenswrapper[4735]: I0223 00:07:46.586184 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04edd9324344faf24b9181705097f238050d69e14bf46da2f76f3d114552c942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:46Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:46 crc kubenswrapper[4735]: I0223 00:07:46.617965 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20a3f0f8-dc0e-4c2c-95ac-2ef9874425f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde203
9ae65385872daabc7afbae10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:46Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:46 crc kubenswrapper[4735]: I0223 00:07:46.640347 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1153ca78-e3e2-458f-8a31-e16738c1677c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43f8fa8d2d76248580ea4800c8e0e97c060feb42b7c78fa9be3f5adbeaaf9437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a47ab0d8562bafd3080c29289e6cd496a9f76188960a3b6036cc044609876406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd23f1d94c143d946e301f9af58c6fe189e1e95ef05c2648ad167acab65726df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb8e52070bd8c7a9784d6f81037eee627297839cf49ec88b861a6d8f3a7a16e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:46Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:47 crc kubenswrapper[4735]: I0223 00:07:47.212272 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 19:55:46.076103108 +0000 UTC Feb 23 00:07:47 crc kubenswrapper[4735]: I0223 00:07:47.271427 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:07:47 crc kubenswrapper[4735]: E0223 00:07:47.271671 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.063880 4735 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.066133 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.066180 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.066197 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.066263 4735 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.076400 4735 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.076515 4735 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.077827 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.077927 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.077946 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.077972 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.077994 4735 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:48Z","lastTransitionTime":"2026-02-23T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:48 crc kubenswrapper[4735]: E0223 00:07:48.108419 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc670e79-a4c0-4f94-a41e-9a217a93a98f\\\",\\\"systemUUID\\\":\\\"34aea2d1-4777-4ec2-a0dd-7ee942962cf5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:48Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.113755 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.113809 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.113826 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.113875 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.113893 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:48Z","lastTransitionTime":"2026-02-23T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:48 crc kubenswrapper[4735]: E0223 00:07:48.134197 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc670e79-a4c0-4f94-a41e-9a217a93a98f\\\",\\\"systemUUID\\\":\\\"34aea2d1-4777-4ec2-a0dd-7ee942962cf5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:48Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.139641 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.139707 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.139725 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.139750 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.139769 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:48Z","lastTransitionTime":"2026-02-23T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:48 crc kubenswrapper[4735]: E0223 00:07:48.164490 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc670e79-a4c0-4f94-a41e-9a217a93a98f\\\",\\\"systemUUID\\\":\\\"34aea2d1-4777-4ec2-a0dd-7ee942962cf5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:48Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.173827 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.173928 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.173953 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.173979 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.173997 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:48Z","lastTransitionTime":"2026-02-23T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:48 crc kubenswrapper[4735]: E0223 00:07:48.196021 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc670e79-a4c0-4f94-a41e-9a217a93a98f\\\",\\\"systemUUID\\\":\\\"34aea2d1-4777-4ec2-a0dd-7ee942962cf5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:48Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.202289 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.202338 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.202350 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.202368 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.202380 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:48Z","lastTransitionTime":"2026-02-23T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.212735 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 08:59:28.411705713 +0000 UTC Feb 23 00:07:48 crc kubenswrapper[4735]: E0223 00:07:48.224182 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177
c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c3
7e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeByt
es\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc670e79-a4c0-4f94-a41e-9a217a93a98f\\\",
\\\"systemUUID\\\":\\\"34aea2d1-4777-4ec2-a0dd-7ee942962cf5\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:48Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:48 crc kubenswrapper[4735]: E0223 00:07:48.224582 4735 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.227671 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.227763 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.227789 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.227819 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.227841 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:48Z","lastTransitionTime":"2026-02-23T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.271530 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.271574 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:07:48 crc kubenswrapper[4735]: E0223 00:07:48.271771 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 00:07:48 crc kubenswrapper[4735]: E0223 00:07:48.272043 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.330588 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.330665 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.330688 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.330718 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.330741 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:48Z","lastTransitionTime":"2026-02-23T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.433771 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.433843 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.433867 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.433924 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.433942 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:48Z","lastTransitionTime":"2026-02-23T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.536805 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.536867 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.536903 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.536927 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.536946 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:48Z","lastTransitionTime":"2026-02-23T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.585165 4735 csr.go:261] certificate signing request csr-sbl6q is approved, waiting to be issued Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.598898 4735 csr.go:257] certificate signing request csr-sbl6q is issued Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.639540 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.639582 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.639591 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.639606 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.639615 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:48Z","lastTransitionTime":"2026-02-23T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.734667 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-jfk7q"] Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.734974 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-jfk7q" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.736728 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.737265 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.737404 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.741354 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.741435 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.741527 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.741547 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.741571 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:48Z","lastTransitionTime":"2026-02-23T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.750540 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:48Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.763757 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40cc14f8-4b0d-4268-82bc-e9c2d8073cf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T00:07:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0223 00:07:35.797171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 00:07:35.799032 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1411013438/tls.crt::/tmp/serving-cert-1411013438/tls.key\\\\\\\"\\\\nI0223 00:07:41.683988 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0223 00:07:41.688343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0223 00:07:41.688375 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0223 00:07:41.688410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0223 00:07:41.688422 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0223 00:07:41.698776 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0223 00:07:41.698820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698840 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0223 00:07:41.698875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0223 00:07:41.698881 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0223 00:07:41.698924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0223 00:07:41.699038 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0223 00:07:41.701822 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:48Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.777114 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:48Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.787897 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:48Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.800632 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a35acbfb5836e5ddac27f7e140e64897f9f2ad3615e4db66b091b79744c699d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c4f2f9d3265d84be4a241b5a2a0193896ad4e9daeec1aaca6fe4fef0be21c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:48Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.815011 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://340ce9e1f14e0dc6339a25b1bd23f58d9790656cfea764bac921d60603920cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:48Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.828203 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04edd9324344faf24b9181705097f238050d69e14bf46da2f76f3d114552c942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:48Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.838507 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jfk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c2131b2-ccbc-49ff-a0bc-fd6639563dd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5g8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jfk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:48Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.843380 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.843416 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.843428 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.843444 4735 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.843457 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:48Z","lastTransitionTime":"2026-02-23T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.860332 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20a3f0f8-dc0e-4c2c-95ac-2ef9874425f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostI
P\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:48Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.864668 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0c2131b2-ccbc-49ff-a0bc-fd6639563dd3-hosts-file\") pod \"node-resolver-jfk7q\" (UID: \"0c2131b2-ccbc-49ff-a0bc-fd6639563dd3\") " pod="openshift-dns/node-resolver-jfk7q" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.864702 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5g8g\" (UniqueName: 
\"kubernetes.io/projected/0c2131b2-ccbc-49ff-a0bc-fd6639563dd3-kube-api-access-r5g8g\") pod \"node-resolver-jfk7q\" (UID: \"0c2131b2-ccbc-49ff-a0bc-fd6639563dd3\") " pod="openshift-dns/node-resolver-jfk7q" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.874264 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1153ca78-e3e2-458f-8a31-e16738c1677c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43f8fa8d2d76248580ea4800c8e0e97c060feb42b7c78fa9be3f5adbeaaf9437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a47ab0d8562bafd3080c29289e6cd496a9f76188960a3b6036cc044609876406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd23f1d94c143d946e301f9af58c6fe189e1e95ef05c2648ad167acab65726df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb8e52070bd8c7a9784d6f81037eee627297839cf49ec88b861a6d8f3a7a16e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:
8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:48Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.945359 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.945388 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.945395 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.945408 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.945417 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:48Z","lastTransitionTime":"2026-02-23T00:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.965097 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0c2131b2-ccbc-49ff-a0bc-fd6639563dd3-hosts-file\") pod \"node-resolver-jfk7q\" (UID: \"0c2131b2-ccbc-49ff-a0bc-fd6639563dd3\") " pod="openshift-dns/node-resolver-jfk7q" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.965147 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5g8g\" (UniqueName: \"kubernetes.io/projected/0c2131b2-ccbc-49ff-a0bc-fd6639563dd3-kube-api-access-r5g8g\") pod \"node-resolver-jfk7q\" (UID: \"0c2131b2-ccbc-49ff-a0bc-fd6639563dd3\") " pod="openshift-dns/node-resolver-jfk7q" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.965465 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0c2131b2-ccbc-49ff-a0bc-fd6639563dd3-hosts-file\") pod \"node-resolver-jfk7q\" (UID: \"0c2131b2-ccbc-49ff-a0bc-fd6639563dd3\") " pod="openshift-dns/node-resolver-jfk7q" Feb 23 00:07:48 crc kubenswrapper[4735]: I0223 00:07:48.988445 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5g8g\" (UniqueName: \"kubernetes.io/projected/0c2131b2-ccbc-49ff-a0bc-fd6639563dd3-kube-api-access-r5g8g\") pod \"node-resolver-jfk7q\" (UID: \"0c2131b2-ccbc-49ff-a0bc-fd6639563dd3\") " 
pod="openshift-dns/node-resolver-jfk7q" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.047429 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-jfk7q" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.048250 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.048290 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.048304 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.048323 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.048334 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:49Z","lastTransitionTime":"2026-02-23T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:49 crc kubenswrapper[4735]: W0223 00:07:49.060398 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c2131b2_ccbc_49ff_a0bc_fd6639563dd3.slice/crio-5dec86c154e9104d1ad8995a8343c7bd083d72807a02a28fe712e040152955ac WatchSource:0}: Error finding container 5dec86c154e9104d1ad8995a8343c7bd083d72807a02a28fe712e040152955ac: Status 404 returned error can't find the container with id 5dec86c154e9104d1ad8995a8343c7bd083d72807a02a28fe712e040152955ac Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.132890 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-blmnv"] Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.133201 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.134988 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.135228 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.135288 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.135937 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.136221 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.145974 4735 
status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jfk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c2131b2-ccbc-49ff-a0bc-fd6639563dd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5g8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jfk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:49Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.150123 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.150164 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.150174 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.150190 4735 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.150201 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:49Z","lastTransitionTime":"2026-02-23T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.157846 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cba474f-2d55-4a07-969f-25e2817a06d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-blmnv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:49Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.175598 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20a3f0f8-dc0e-4c2c-95ac-2ef9874425f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b
64bb45daaca0ba4773710ce31dbaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:49Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.191335 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1153ca78-e3e2-458f-8a31-e16738c1677c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43f8fa8d2d76248580ea4800c8e0e97c060feb42b7c78fa9be3f5adbeaaf9437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a47ab0d8562bafd3080c29289e6cd496a9f76188960a3b6036cc044609876406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd23f1d94c143d946e301f9af58c6fe189e1e95ef05c2648ad167acab65726df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb8e52070bd8c7a9784d6f81037eee627297839cf49ec88b861a6d8f3a7a16e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:49Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.204592 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://340ce9e1f14e0dc6339a25b1bd23f58d9790656cfea764bac921d60603920cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:49Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.213009 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 22:27:46.499591681 +0000 UTC Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.218603 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04edd9324344faf24b9181705097f238050d69e14bf46da2f76f3d114552c942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":
\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:49Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.231577 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40cc14f8-4b0d-4268-82bc-e9c2d8073cf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T00:07:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0223 00:07:35.797171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 00:07:35.799032 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1411013438/tls.crt::/tmp/serving-cert-1411013438/tls.key\\\\\\\"\\\\nI0223 00:07:41.683988 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0223 00:07:41.688343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0223 00:07:41.688375 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0223 00:07:41.688410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0223 00:07:41.688422 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0223 00:07:41.698776 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0223 00:07:41.698820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698840 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0223 00:07:41.698875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0223 00:07:41.698881 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0223 00:07:41.698924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0223 00:07:41.699038 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0223 00:07:41.701822 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:49Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.247562 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:49Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.252147 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.252175 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.252185 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:49 crc 
kubenswrapper[4735]: I0223 00:07:49.252200 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.252211 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:49Z","lastTransitionTime":"2026-02-23T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.259482 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:49Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.267112 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1cba474f-2d55-4a07-969f-25e2817a06d0-mcd-auth-proxy-config\") pod \"machine-config-daemon-blmnv\" (UID: \"1cba474f-2d55-4a07-969f-25e2817a06d0\") " 
pod="openshift-machine-config-operator/machine-config-daemon-blmnv" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.267146 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp46c\" (UniqueName: \"kubernetes.io/projected/1cba474f-2d55-4a07-969f-25e2817a06d0-kube-api-access-wp46c\") pod \"machine-config-daemon-blmnv\" (UID: \"1cba474f-2d55-4a07-969f-25e2817a06d0\") " pod="openshift-machine-config-operator/machine-config-daemon-blmnv" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.267165 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/1cba474f-2d55-4a07-969f-25e2817a06d0-rootfs\") pod \"machine-config-daemon-blmnv\" (UID: \"1cba474f-2d55-4a07-969f-25e2817a06d0\") " pod="openshift-machine-config-operator/machine-config-daemon-blmnv" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.267192 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1cba474f-2d55-4a07-969f-25e2817a06d0-proxy-tls\") pod \"machine-config-daemon-blmnv\" (UID: \"1cba474f-2d55-4a07-969f-25e2817a06d0\") " pod="openshift-machine-config-operator/machine-config-daemon-blmnv" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.271325 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a35acbfb5836e5ddac27f7e140e64897f9f2ad3615e4db66b091b79744c699d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c4f2f9d3265d84be4a241b5a2a0193896ad4e9daeec1aaca6fe4fef0be21c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:49Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.271416 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:07:49 crc kubenswrapper[4735]: E0223 00:07:49.271634 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.282684 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:49Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.355042 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.355141 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.355159 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.355185 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.355200 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:49Z","lastTransitionTime":"2026-02-23T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.368355 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.368473 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/1cba474f-2d55-4a07-969f-25e2817a06d0-rootfs\") pod \"machine-config-daemon-blmnv\" (UID: \"1cba474f-2d55-4a07-969f-25e2817a06d0\") " pod="openshift-machine-config-operator/machine-config-daemon-blmnv" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.368509 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.368534 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1cba474f-2d55-4a07-969f-25e2817a06d0-proxy-tls\") pod \"machine-config-daemon-blmnv\" (UID: \"1cba474f-2d55-4a07-969f-25e2817a06d0\") " pod="openshift-machine-config-operator/machine-config-daemon-blmnv" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.368582 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.368624 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/1cba474f-2d55-4a07-969f-25e2817a06d0-rootfs\") pod \"machine-config-daemon-blmnv\" (UID: \"1cba474f-2d55-4a07-969f-25e2817a06d0\") " pod="openshift-machine-config-operator/machine-config-daemon-blmnv" Feb 23 00:07:49 crc kubenswrapper[4735]: E0223 00:07:49.368631 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:07:57.368578506 +0000 UTC m=+35.832124487 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:07:49 crc kubenswrapper[4735]: E0223 00:07:49.368708 4735 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 00:07:49 crc kubenswrapper[4735]: E0223 00:07:49.368723 4735 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 00:07:49 crc kubenswrapper[4735]: E0223 00:07:49.368785 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 00:07:57.368764661 +0000 UTC m=+35.832310822 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.368712 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1cba474f-2d55-4a07-969f-25e2817a06d0-mcd-auth-proxy-config\") pod \"machine-config-daemon-blmnv\" (UID: \"1cba474f-2d55-4a07-969f-25e2817a06d0\") " pod="openshift-machine-config-operator/machine-config-daemon-blmnv" Feb 23 00:07:49 crc kubenswrapper[4735]: E0223 00:07:49.368905 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 00:07:57.368847643 +0000 UTC m=+35.832393754 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.368958 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wp46c\" (UniqueName: \"kubernetes.io/projected/1cba474f-2d55-4a07-969f-25e2817a06d0-kube-api-access-wp46c\") pod \"machine-config-daemon-blmnv\" (UID: \"1cba474f-2d55-4a07-969f-25e2817a06d0\") " pod="openshift-machine-config-operator/machine-config-daemon-blmnv" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.369584 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1cba474f-2d55-4a07-969f-25e2817a06d0-mcd-auth-proxy-config\") pod \"machine-config-daemon-blmnv\" (UID: \"1cba474f-2d55-4a07-969f-25e2817a06d0\") " pod="openshift-machine-config-operator/machine-config-daemon-blmnv" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.374918 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1cba474f-2d55-4a07-969f-25e2817a06d0-proxy-tls\") pod \"machine-config-daemon-blmnv\" (UID: \"1cba474f-2d55-4a07-969f-25e2817a06d0\") " pod="openshift-machine-config-operator/machine-config-daemon-blmnv" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.387365 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp46c\" (UniqueName: \"kubernetes.io/projected/1cba474f-2d55-4a07-969f-25e2817a06d0-kube-api-access-wp46c\") pod \"machine-config-daemon-blmnv\" (UID: \"1cba474f-2d55-4a07-969f-25e2817a06d0\") " pod="openshift-machine-config-operator/machine-config-daemon-blmnv" Feb 23 00:07:49 crc 
kubenswrapper[4735]: I0223 00:07:49.441775 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jfk7q" event={"ID":"0c2131b2-ccbc-49ff-a0bc-fd6639563dd3","Type":"ContainerStarted","Data":"a0d7eacf1e5180d67f73b2e794d7a0a995619d05d1019a3e152e3fa4b7c4c369"} Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.441823 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jfk7q" event={"ID":"0c2131b2-ccbc-49ff-a0bc-fd6639563dd3","Type":"ContainerStarted","Data":"5dec86c154e9104d1ad8995a8343c7bd083d72807a02a28fe712e040152955ac"} Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.444479 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" Feb 23 00:07:49 crc kubenswrapper[4735]: W0223 00:07:49.453315 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1cba474f_2d55_4a07_969f_25e2817a06d0.slice/crio-d4ffd208b9481cfe73ca8793cb29bd1e747c17e2d8d8c5aca31e6ee690add5a5 WatchSource:0}: Error finding container d4ffd208b9481cfe73ca8793cb29bd1e747c17e2d8d8c5aca31e6ee690add5a5: Status 404 returned error can't find the container with id d4ffd208b9481cfe73ca8793cb29bd1e747c17e2d8d8c5aca31e6ee690add5a5 Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.462176 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.462212 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.462223 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.462244 4735 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.462254 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:49Z","lastTransitionTime":"2026-02-23T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.463823 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:49Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.470142 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.470200 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:07:49 crc kubenswrapper[4735]: E0223 00:07:49.470352 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 00:07:49 crc kubenswrapper[4735]: E0223 00:07:49.470368 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 00:07:49 crc kubenswrapper[4735]: E0223 00:07:49.470410 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 00:07:49 crc kubenswrapper[4735]: E0223 00:07:49.470379 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 00:07:49 crc kubenswrapper[4735]: E0223 00:07:49.470425 4735 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 00:07:49 crc kubenswrapper[4735]: E0223 00:07:49.470440 4735 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 00:07:49 crc kubenswrapper[4735]: E0223 00:07:49.470496 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-23 00:07:57.470476711 +0000 UTC m=+35.934022682 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 00:07:49 crc kubenswrapper[4735]: E0223 00:07:49.470518 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-23 00:07:57.470511182 +0000 UTC m=+35.934057153 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.488745 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40cc14f8-4b0d-4268-82bc-e9c2d8073cf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T00:07:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0223 00:07:35.797171 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 00:07:35.799032 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411013438/tls.crt::/tmp/serving-cert-1411013438/tls.key\\\\\\\"\\\\nI0223 00:07:41.683988 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0223 00:07:41.688343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0223 00:07:41.688375 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0223 00:07:41.688410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0223 00:07:41.688422 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0223 00:07:41.698776 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0223 00:07:41.698820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698840 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0223 00:07:41.698875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0223 00:07:41.698881 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0223 00:07:41.698924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0223 00:07:41.699038 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0223 00:07:41.701822 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:49Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.504916 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:49Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.520943 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:49Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.524214 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-26428"] Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.524833 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-26428" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.525008 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-4gvxr"] Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.525479 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-4gvxr" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.527694 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.527819 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.528173 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.528227 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.528248 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.528572 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.529393 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.555712 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a35acbfb5836e5ddac27f7e140e64897f9f2ad3615e4db66b091b79744c699d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c4f2f9d3265d84be4a241b5a2a0193896ad4e9daeec1aaca6fe4fef0be21c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:49Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.565373 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.565408 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.565417 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.565434 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.565445 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:49Z","lastTransitionTime":"2026-02-23T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.575328 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://340ce9e1f14e0dc6339a25b1bd23f58d9790656cfea764bac921d60603920cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:49Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.591719 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04edd9324344faf24b9181705097f238050d69e14bf46da2f76f3d114552c942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:49Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.600926 4735 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-23 00:02:48 +0000 UTC, rotation deadline is 2026-11-26 08:32:40.582919048 +0000 UTC Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.601138 4735 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6632h24m50.981785226s for next certificate rotation Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.606456 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jfk7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c2131b2-ccbc-49ff-a0bc-fd6639563dd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0d7eacf1e5180d67f73b2e794d7a0a995619d05d1019a3e152e3fa4b7c4c369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5g8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jfk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:49Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.622438 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cba474f-2d55-4a07-969f-25e2817a06d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-blmnv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:49Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.642557 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20a3f0f8-dc0e-4c2c-95ac-2ef9874425f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b
64bb45daaca0ba4773710ce31dbaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:49Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.656949 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1153ca78-e3e2-458f-8a31-e16738c1677c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43f8fa8d2d76248580ea4800c8e0e97c060feb42b7c78fa9be3f5adbeaaf9437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a47ab0d8562bafd3080c29289e6cd496a9f76188960a3b6036cc044609876406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd23f1d94c143d946e301f9af58c6fe189e1e95ef05c2648ad167acab65726df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb8e52070bd8c7a9784d6f81037eee627297839cf49ec88b861a6d8f3a7a16e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:49Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.667618 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.667657 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.667666 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.667682 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.667692 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:49Z","lastTransitionTime":"2026-02-23T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.669734 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:49Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.671905 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5b63c18f-b6b2-4d97-b542-7800b475bd4c-multus-conf-dir\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.672215 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/29209462-90c5-4aa1-9943-d8b15ac1b5a3-cnibin\") pod \"multus-additional-cni-plugins-26428\" (UID: \"29209462-90c5-4aa1-9943-d8b15ac1b5a3\") " pod="openshift-multus/multus-additional-cni-plugins-26428" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.672300 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/29209462-90c5-4aa1-9943-d8b15ac1b5a3-os-release\") pod \"multus-additional-cni-plugins-26428\" (UID: \"29209462-90c5-4aa1-9943-d8b15ac1b5a3\") " pod="openshift-multus/multus-additional-cni-plugins-26428" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.672407 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5b63c18f-b6b2-4d97-b542-7800b475bd4c-cni-binary-copy\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.672500 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5b63c18f-b6b2-4d97-b542-7800b475bd4c-os-release\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.672591 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5b63c18f-b6b2-4d97-b542-7800b475bd4c-hostroot\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.672684 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5b63c18f-b6b2-4d97-b542-7800b475bd4c-host-run-multus-certs\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.672762 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/29209462-90c5-4aa1-9943-d8b15ac1b5a3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-26428\" (UID: \"29209462-90c5-4aa1-9943-d8b15ac1b5a3\") " pod="openshift-multus/multus-additional-cni-plugins-26428" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.672839 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5b63c18f-b6b2-4d97-b542-7800b475bd4c-host-run-k8s-cni-cncf-io\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.672946 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5b63c18f-b6b2-4d97-b542-7800b475bd4c-multus-socket-dir-parent\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.673027 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5b63c18f-b6b2-4d97-b542-7800b475bd4c-multus-cni-dir\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.673100 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5b63c18f-b6b2-4d97-b542-7800b475bd4c-etc-kubernetes\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.673185 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5b63c18f-b6b2-4d97-b542-7800b475bd4c-host-run-netns\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.673259 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5b63c18f-b6b2-4d97-b542-7800b475bd4c-host-var-lib-kubelet\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.673361 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5b63c18f-b6b2-4d97-b542-7800b475bd4c-system-cni-dir\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.673483 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4pzl\" (UniqueName: \"kubernetes.io/projected/5b63c18f-b6b2-4d97-b542-7800b475bd4c-kube-api-access-n4pzl\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.673577 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/29209462-90c5-4aa1-9943-d8b15ac1b5a3-system-cni-dir\") pod \"multus-additional-cni-plugins-26428\" (UID: \"29209462-90c5-4aa1-9943-d8b15ac1b5a3\") " pod="openshift-multus/multus-additional-cni-plugins-26428" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.673686 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/29209462-90c5-4aa1-9943-d8b15ac1b5a3-cni-binary-copy\") pod \"multus-additional-cni-plugins-26428\" (UID: \"29209462-90c5-4aa1-9943-d8b15ac1b5a3\") " pod="openshift-multus/multus-additional-cni-plugins-26428" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.673766 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5b63c18f-b6b2-4d97-b542-7800b475bd4c-cnibin\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.673836 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5b63c18f-b6b2-4d97-b542-7800b475bd4c-host-var-lib-cni-bin\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.673960 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5b63c18f-b6b2-4d97-b542-7800b475bd4c-host-var-lib-cni-multus\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.674034 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6sqn\" (UniqueName: \"kubernetes.io/projected/29209462-90c5-4aa1-9943-d8b15ac1b5a3-kube-api-access-x6sqn\") pod \"multus-additional-cni-plugins-26428\" (UID: \"29209462-90c5-4aa1-9943-d8b15ac1b5a3\") " pod="openshift-multus/multus-additional-cni-plugins-26428" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 
00:07:49.674102 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5b63c18f-b6b2-4d97-b542-7800b475bd4c-multus-daemon-config\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.674174 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/29209462-90c5-4aa1-9943-d8b15ac1b5a3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-26428\" (UID: \"29209462-90c5-4aa1-9943-d8b15ac1b5a3\") " pod="openshift-multus/multus-additional-cni-plugins-26428" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.683362 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a35acbfb5836e5ddac27f7e140e64897f9f2ad3615e4db66b091b79744c699d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174
f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c4f2f9d3265d84be4a241b5a2a0193896ad4e9daeec1aaca6fe4fef0be21c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-02-23T00:07:49Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.694528 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cba474f-2d55-4a07-969f-25e2817a06d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-blmnv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:49Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.706546 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26428" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29209462-90c5-4aa1-9943-d8b15ac1b5a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26428\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:49Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.717420 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1153ca78-e3e2-458f-8a31-e16738c1677c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43f8fa8d2d76248580ea4800c8e0e97c060feb42b7c78fa9be3f5adbeaaf9437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed
21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a47ab0d8562bafd3080c29289e6cd496a9f76188960a3b6036cc044609876406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd23f1d94c143d946e301f9af58c6fe189e1e95ef05c2648ad167acab65726df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb8e52070bd8c7a9784d6f81037eee627297839cf49ec88b861a6d8f3a7a16e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:49Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.728250 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://340ce9e1f14e0dc6339a25b1bd23f58d9790656cfea764bac921d60603920cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:49Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.738398 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04edd9324344faf24b9181705097f238050d69e14bf46da2f76f3d114552c942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recur
siveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:49Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.752291 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4gvxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b63c18f-b6b2-4d97-b542-7800b475bd4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4pzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4gvxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:49Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.765235 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40cc14f8-4b0d-4268-82bc-e9c2d8073cf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T00:07:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0223 00:07:35.797171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 00:07:35.799032 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1411013438/tls.crt::/tmp/serving-cert-1411013438/tls.key\\\\\\\"\\\\nI0223 00:07:41.683988 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0223 00:07:41.688343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0223 00:07:41.688375 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0223 00:07:41.688410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0223 00:07:41.688422 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0223 00:07:41.698776 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0223 00:07:41.698820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698840 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0223 00:07:41.698875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0223 00:07:41.698881 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0223 00:07:41.698924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0223 00:07:41.699038 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0223 00:07:41.701822 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:49Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.770429 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.770469 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.770480 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.770499 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.770513 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:49Z","lastTransitionTime":"2026-02-23T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.775073 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5b63c18f-b6b2-4d97-b542-7800b475bd4c-multus-daemon-config\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.775113 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/29209462-90c5-4aa1-9943-d8b15ac1b5a3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-26428\" (UID: \"29209462-90c5-4aa1-9943-d8b15ac1b5a3\") " pod="openshift-multus/multus-additional-cni-plugins-26428" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.775138 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5b63c18f-b6b2-4d97-b542-7800b475bd4c-multus-conf-dir\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.775159 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/29209462-90c5-4aa1-9943-d8b15ac1b5a3-cnibin\") pod \"multus-additional-cni-plugins-26428\" (UID: \"29209462-90c5-4aa1-9943-d8b15ac1b5a3\") " pod="openshift-multus/multus-additional-cni-plugins-26428" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.775177 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/29209462-90c5-4aa1-9943-d8b15ac1b5a3-os-release\") pod \"multus-additional-cni-plugins-26428\" (UID: \"29209462-90c5-4aa1-9943-d8b15ac1b5a3\") " pod="openshift-multus/multus-additional-cni-plugins-26428" Feb 23 
00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.775206 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5b63c18f-b6b2-4d97-b542-7800b475bd4c-cni-binary-copy\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.775233 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5b63c18f-b6b2-4d97-b542-7800b475bd4c-hostroot\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.775250 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5b63c18f-b6b2-4d97-b542-7800b475bd4c-host-run-multus-certs\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.775268 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/29209462-90c5-4aa1-9943-d8b15ac1b5a3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-26428\" (UID: \"29209462-90c5-4aa1-9943-d8b15ac1b5a3\") " pod="openshift-multus/multus-additional-cni-plugins-26428" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.775285 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5b63c18f-b6b2-4d97-b542-7800b475bd4c-os-release\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.775303 4735 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5b63c18f-b6b2-4d97-b542-7800b475bd4c-host-run-k8s-cni-cncf-io\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.775322 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5b63c18f-b6b2-4d97-b542-7800b475bd4c-multus-socket-dir-parent\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.775340 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5b63c18f-b6b2-4d97-b542-7800b475bd4c-multus-cni-dir\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.775355 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5b63c18f-b6b2-4d97-b542-7800b475bd4c-etc-kubernetes\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.775417 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5b63c18f-b6b2-4d97-b542-7800b475bd4c-host-run-netns\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.775449 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5b63c18f-b6b2-4d97-b542-7800b475bd4c-host-var-lib-kubelet\") pod 
\"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.775480 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5b63c18f-b6b2-4d97-b542-7800b475bd4c-system-cni-dir\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.775506 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4pzl\" (UniqueName: \"kubernetes.io/projected/5b63c18f-b6b2-4d97-b542-7800b475bd4c-kube-api-access-n4pzl\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.775528 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/29209462-90c5-4aa1-9943-d8b15ac1b5a3-system-cni-dir\") pod \"multus-additional-cni-plugins-26428\" (UID: \"29209462-90c5-4aa1-9943-d8b15ac1b5a3\") " pod="openshift-multus/multus-additional-cni-plugins-26428" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.775549 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/29209462-90c5-4aa1-9943-d8b15ac1b5a3-cni-binary-copy\") pod \"multus-additional-cni-plugins-26428\" (UID: \"29209462-90c5-4aa1-9943-d8b15ac1b5a3\") " pod="openshift-multus/multus-additional-cni-plugins-26428" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.775567 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5b63c18f-b6b2-4d97-b542-7800b475bd4c-host-var-lib-cni-bin\") pod \"multus-4gvxr\" (UID: 
\"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.775584 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5b63c18f-b6b2-4d97-b542-7800b475bd4c-host-var-lib-cni-multus\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.775603 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5b63c18f-b6b2-4d97-b542-7800b475bd4c-cnibin\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.775620 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6sqn\" (UniqueName: \"kubernetes.io/projected/29209462-90c5-4aa1-9943-d8b15ac1b5a3-kube-api-access-x6sqn\") pod \"multus-additional-cni-plugins-26428\" (UID: \"29209462-90c5-4aa1-9943-d8b15ac1b5a3\") " pod="openshift-multus/multus-additional-cni-plugins-26428" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.775865 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5b63c18f-b6b2-4d97-b542-7800b475bd4c-multus-conf-dir\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.775916 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/29209462-90c5-4aa1-9943-d8b15ac1b5a3-cnibin\") pod \"multus-additional-cni-plugins-26428\" (UID: \"29209462-90c5-4aa1-9943-d8b15ac1b5a3\") " pod="openshift-multus/multus-additional-cni-plugins-26428" Feb 23 00:07:49 crc 
kubenswrapper[4735]: I0223 00:07:49.775951 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/29209462-90c5-4aa1-9943-d8b15ac1b5a3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-26428\" (UID: \"29209462-90c5-4aa1-9943-d8b15ac1b5a3\") " pod="openshift-multus/multus-additional-cni-plugins-26428" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.775978 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/29209462-90c5-4aa1-9943-d8b15ac1b5a3-os-release\") pod \"multus-additional-cni-plugins-26428\" (UID: \"29209462-90c5-4aa1-9943-d8b15ac1b5a3\") " pod="openshift-multus/multus-additional-cni-plugins-26428" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.776014 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5b63c18f-b6b2-4d97-b542-7800b475bd4c-host-var-lib-kubelet\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.776011 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5b63c18f-b6b2-4d97-b542-7800b475bd4c-etc-kubernetes\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.776036 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5b63c18f-b6b2-4d97-b542-7800b475bd4c-host-run-netns\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.775974 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5b63c18f-b6b2-4d97-b542-7800b475bd4c-multus-daemon-config\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.776069 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5b63c18f-b6b2-4d97-b542-7800b475bd4c-system-cni-dir\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.776128 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5b63c18f-b6b2-4d97-b542-7800b475bd4c-host-var-lib-cni-bin\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.776179 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/29209462-90c5-4aa1-9943-d8b15ac1b5a3-system-cni-dir\") pod \"multus-additional-cni-plugins-26428\" (UID: \"29209462-90c5-4aa1-9943-d8b15ac1b5a3\") " pod="openshift-multus/multus-additional-cni-plugins-26428" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.776210 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5b63c18f-b6b2-4d97-b542-7800b475bd4c-host-var-lib-cni-multus\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr" Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.776240 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/5b63c18f-b6b2-4d97-b542-7800b475bd4c-host-run-k8s-cni-cncf-io\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr"
Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.776291 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5b63c18f-b6b2-4d97-b542-7800b475bd4c-os-release\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr"
Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.776381 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5b63c18f-b6b2-4d97-b542-7800b475bd4c-hostroot\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr"
Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.776382 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5b63c18f-b6b2-4d97-b542-7800b475bd4c-cnibin\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr"
Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.776418 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5b63c18f-b6b2-4d97-b542-7800b475bd4c-host-run-multus-certs\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr"
Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.776402 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5b63c18f-b6b2-4d97-b542-7800b475bd4c-multus-socket-dir-parent\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr"
Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.776466 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5b63c18f-b6b2-4d97-b542-7800b475bd4c-multus-cni-dir\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr"
Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.776477 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5b63c18f-b6b2-4d97-b542-7800b475bd4c-cni-binary-copy\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr"
Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.777340 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/29209462-90c5-4aa1-9943-d8b15ac1b5a3-cni-binary-copy\") pod \"multus-additional-cni-plugins-26428\" (UID: \"29209462-90c5-4aa1-9943-d8b15ac1b5a3\") " pod="openshift-multus/multus-additional-cni-plugins-26428"
Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.777601 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/29209462-90c5-4aa1-9943-d8b15ac1b5a3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-26428\" (UID: \"29209462-90c5-4aa1-9943-d8b15ac1b5a3\") " pod="openshift-multus/multus-additional-cni-plugins-26428"
Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.779660 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:49Z is after 2025-08-24T17:21:41Z"
Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.795678 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:49Z is after 2025-08-24T17:21:41Z"
Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.796139 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4pzl\" (UniqueName: \"kubernetes.io/projected/5b63c18f-b6b2-4d97-b542-7800b475bd4c-kube-api-access-n4pzl\") pod \"multus-4gvxr\" (UID: \"5b63c18f-b6b2-4d97-b542-7800b475bd4c\") " pod="openshift-multus/multus-4gvxr"
Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.796153 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6sqn\" (UniqueName: \"kubernetes.io/projected/29209462-90c5-4aa1-9943-d8b15ac1b5a3-kube-api-access-x6sqn\") pod \"multus-additional-cni-plugins-26428\" (UID: \"29209462-90c5-4aa1-9943-d8b15ac1b5a3\") " pod="openshift-multus/multus-additional-cni-plugins-26428"
Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.811971 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jfk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c2131b2-ccbc-49ff-a0bc-fd6639563dd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0d7eacf1e5180d67f73b2e794d7a0a995619d05d1019a3e152e3fa4b7c4c369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5g8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jfk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:49Z is after 2025-08-24T17:21:41Z"
Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.835279 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-26428"
Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.841953 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-4gvxr"
Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.846015 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20a3f0f8-dc0e-4c2c-95ac-2ef9874425f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:49Z is after 2025-08-24T17:21:41Z"
Feb 23 00:07:49 crc kubenswrapper[4735]: W0223 00:07:49.854354 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29209462_90c5_4aa1_9943_d8b15ac1b5a3.slice/crio-4d59f78b6899c4cb9e4a7126fd6e27ae8c5b8572295ddc37d3fd70faa69ba84e WatchSource:0}: Error finding container 4d59f78b6899c4cb9e4a7126fd6e27ae8c5b8572295ddc37d3fd70faa69ba84e: Status 404 returned error can't find the container with id 4d59f78b6899c4cb9e4a7126fd6e27ae8c5b8572295ddc37d3fd70faa69ba84e
Feb 23 00:07:49 crc kubenswrapper[4735]: W0223 00:07:49.862574 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b63c18f_b6b2_4d97_b542_7800b475bd4c.slice/crio-a68b7be55a5a82ba7b198a2fcbda1cfb7853837593322ade77ea6c8460a1b911 WatchSource:0}: Error finding container a68b7be55a5a82ba7b198a2fcbda1cfb7853837593322ade77ea6c8460a1b911: Status 404 returned error can't find the container with id a68b7be55a5a82ba7b198a2fcbda1cfb7853837593322ade77ea6c8460a1b911
Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.872487 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.872542 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.872555 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.872578 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.872592 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:49Z","lastTransitionTime":"2026-02-23T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.935706 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-59rkm"]
Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.936764 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm"
Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.942505 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.942564 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.942641 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.942789 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.943194 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.943489 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.944318 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.964721 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:49Z is after 2025-08-24T17:21:41Z"
Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.976121 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.976191 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.976204 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.976252 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.976264 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:49Z","lastTransitionTime":"2026-02-23T00:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.981277 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a35acbfb5836e5ddac27f7e140e64897f9f2ad3615e4db66b091b79744c699d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c4f2f9d3265d84be4a241b5a2a0193896ad4e9daeec1aaca6fe4fef0be21c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:49Z is after 2025-08-24T17:21:41Z"
Feb 23 00:07:49 crc kubenswrapper[4735]: I0223 00:07:49.997909 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1153ca78-e3e2-458f-8a31-e16738c1677c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43f8fa8d2d76248580ea4800c8e0e97c060feb42b7c78fa9be3f5adbeaaf9437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a47ab0d8562bafd3080c29289e6cd496a9f76188960a3b6036cc044609876406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd23f1d94c143d946e301f9af58c6fe189e1e95ef05c2648ad167acab65726df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb8e52070bd8c7a9784d6f81037eee627297839cf49ec88b861a6d8f3a7a16e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:49Z is after 2025-08-24T17:21:41Z"
Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.014349 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://340ce9e1f14e0dc6339a25b1bd23f58d9790656cfea764bac921d60603920cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.040399 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04edd9324344faf24b9181705097f238050d69e14bf46da2f76f3d114552c942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.054813 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cba474f-2d55-4a07-969f-25e2817a06d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-blmnv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.077674 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26428" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29209462-90c5-4aa1-9943-d8b15ac1b5a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26428\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.078535 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.078572 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.078583 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.078598 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.078609 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:50Z","lastTransitionTime":"2026-02-23T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.079943 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-host-cni-netd\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.079978 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/66853c8a-9391-4291-b5f1-c72cb5fe23e8-ovnkube-config\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.080007 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/66853c8a-9391-4291-b5f1-c72cb5fe23e8-ovnkube-script-lib\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.080038 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-node-log\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.080062 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-host-slash\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.080083 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-var-lib-openvswitch\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.080104 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-host-cni-bin\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.080128 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-systemd-units\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.080149 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-run-systemd\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.080168 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-host-var-lib-cni-networks-ovn-kubernetes\") pod 
\"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.080191 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/66853c8a-9391-4291-b5f1-c72cb5fe23e8-ovn-node-metrics-cert\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.080215 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-run-ovn\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.080256 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-log-socket\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.080276 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzvft\" (UniqueName: \"kubernetes.io/projected/66853c8a-9391-4291-b5f1-c72cb5fe23e8-kube-api-access-tzvft\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.080299 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-host-run-netns\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.080331 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-etc-openvswitch\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.080350 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/66853c8a-9391-4291-b5f1-c72cb5fe23e8-env-overrides\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.080375 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-host-run-ovn-kubernetes\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.080432 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-host-kubelet\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.080455 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-run-openvswitch\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.098946 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40cc14f8-4b0d-4268-82bc-e9c2d8073cf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T00:07:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0223 00:07:35.797171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 00:07:35.799032 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1411013438/tls.crt::/tmp/serving-cert-1411013438/tls.key\\\\\\\"\\\\nI0223 00:07:41.683988 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0223 00:07:41.688343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0223 00:07:41.688375 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0223 00:07:41.688410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0223 00:07:41.688422 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0223 00:07:41.698776 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0223 00:07:41.698820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698840 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0223 00:07:41.698875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0223 00:07:41.698881 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0223 00:07:41.698924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0223 00:07:41.699038 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0223 00:07:41.701822 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.116108 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.131813 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.149224 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4gvxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b63c18f-b6b2-4d97-b542-7800b475bd4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4pzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4gvxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.172418 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66853c8a-9391-4291-b5f1-c72cb5fe23e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-59rkm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.180902 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.180933 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.180944 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.180963 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.180974 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:50Z","lastTransitionTime":"2026-02-23T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.180994 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-log-socket\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.181034 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzvft\" (UniqueName: \"kubernetes.io/projected/66853c8a-9391-4291-b5f1-c72cb5fe23e8-kube-api-access-tzvft\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.181057 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/66853c8a-9391-4291-b5f1-c72cb5fe23e8-env-overrides\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.181081 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-host-run-netns\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.181101 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-etc-openvswitch\") pod 
\"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.181125 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-host-run-ovn-kubernetes\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.181157 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-run-openvswitch\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.181185 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-host-kubelet\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.181205 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-host-cni-netd\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.181228 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/66853c8a-9391-4291-b5f1-c72cb5fe23e8-ovnkube-config\") pod \"ovnkube-node-59rkm\" (UID: 
\"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.181251 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/66853c8a-9391-4291-b5f1-c72cb5fe23e8-ovnkube-script-lib\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.181273 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-node-log\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.181296 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-host-slash\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.181320 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-var-lib-openvswitch\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.181341 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-host-cni-bin\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.181360 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-systemd-units\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.181355 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-host-run-ovn-kubernetes\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.181381 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.181404 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/66853c8a-9391-4291-b5f1-c72cb5fe23e8-ovn-node-metrics-cert\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.181426 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-run-systemd\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.181428 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-log-socket\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.181446 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-run-ovn\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.181462 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-node-log\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.181510 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-run-ovn\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.181509 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-run-openvswitch\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.181533 4735 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-host-kubelet\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.181556 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-host-cni-netd\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.181575 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-host-slash\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.181608 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-var-lib-openvswitch\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.181636 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-host-cni-bin\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.181662 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-systemd-units\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.181689 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.181896 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-host-run-netns\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.181949 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-run-systemd\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.182019 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-etc-openvswitch\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.182466 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/66853c8a-9391-4291-b5f1-c72cb5fe23e8-ovnkube-config\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.182486 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/66853c8a-9391-4291-b5f1-c72cb5fe23e8-ovnkube-script-lib\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.182576 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/66853c8a-9391-4291-b5f1-c72cb5fe23e8-env-overrides\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.186188 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/66853c8a-9391-4291-b5f1-c72cb5fe23e8-ovn-node-metrics-cert\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.195733 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20a3f0f8-dc0e-4c2c-95ac-2ef9874425f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.198996 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzvft\" (UniqueName: \"kubernetes.io/projected/66853c8a-9391-4291-b5f1-c72cb5fe23e8-kube-api-access-tzvft\") pod \"ovnkube-node-59rkm\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.208984 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jfk7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c2131b2-ccbc-49ff-a0bc-fd6639563dd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0d7eacf1e5180d67f73b2e794d7a0a995619d05d1019a3e152e3fa4b7c4c369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5g8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jfk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.213242 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 19:54:08.170918236 +0000 UTC Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.269654 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.271291 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.271330 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:07:50 crc kubenswrapper[4735]: E0223 00:07:50.271437 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 00:07:50 crc kubenswrapper[4735]: E0223 00:07:50.271587 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.291374 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.291416 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.291426 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.291442 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.291484 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:50Z","lastTransitionTime":"2026-02-23T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:50 crc kubenswrapper[4735]: W0223 00:07:50.302106 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66853c8a_9391_4291_b5f1_c72cb5fe23e8.slice/crio-c050d294b8e371b56c70146fe65693fe67fe1c27d70e8f710586a0ea9aff1f67 WatchSource:0}: Error finding container c050d294b8e371b56c70146fe65693fe67fe1c27d70e8f710586a0ea9aff1f67: Status 404 returned error can't find the container with id c050d294b8e371b56c70146fe65693fe67fe1c27d70e8f710586a0ea9aff1f67 Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.394157 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.394209 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.394221 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.394240 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.394251 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:50Z","lastTransitionTime":"2026-02-23T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.447888 4735 generic.go:334] "Generic (PLEG): container finished" podID="29209462-90c5-4aa1-9943-d8b15ac1b5a3" containerID="79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6" exitCode=0 Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.447981 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-26428" event={"ID":"29209462-90c5-4aa1-9943-d8b15ac1b5a3","Type":"ContainerDied","Data":"79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6"} Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.448053 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-26428" event={"ID":"29209462-90c5-4aa1-9943-d8b15ac1b5a3","Type":"ContainerStarted","Data":"4d59f78b6899c4cb9e4a7126fd6e27ae8c5b8572295ddc37d3fd70faa69ba84e"} Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.450992 4735 generic.go:334] "Generic (PLEG): container finished" podID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerID="1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee" exitCode=0 Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.451077 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" event={"ID":"66853c8a-9391-4291-b5f1-c72cb5fe23e8","Type":"ContainerDied","Data":"1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee"} Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.451158 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" event={"ID":"66853c8a-9391-4291-b5f1-c72cb5fe23e8","Type":"ContainerStarted","Data":"c050d294b8e371b56c70146fe65693fe67fe1c27d70e8f710586a0ea9aff1f67"} Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.453254 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4gvxr" 
event={"ID":"5b63c18f-b6b2-4d97-b542-7800b475bd4c","Type":"ContainerStarted","Data":"33055c975ea730c1618798b3885ea706d784d23c63c80c8fbb66ffbc27318c38"} Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.453313 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4gvxr" event={"ID":"5b63c18f-b6b2-4d97-b542-7800b475bd4c","Type":"ContainerStarted","Data":"a68b7be55a5a82ba7b198a2fcbda1cfb7853837593322ade77ea6c8460a1b911"} Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.456159 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" event={"ID":"1cba474f-2d55-4a07-969f-25e2817a06d0","Type":"ContainerStarted","Data":"b3ae04d2c5dac3e689b744efd8990e7210f627cf79b77aab8cdb477e22eedeef"} Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.456218 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" event={"ID":"1cba474f-2d55-4a07-969f-25e2817a06d0","Type":"ContainerStarted","Data":"908fd79ca8e314a794a24b502c23d0b5abbc41b2bf079086e8fe6fc3ceaeafa0"} Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.456233 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" event={"ID":"1cba474f-2d55-4a07-969f-25e2817a06d0","Type":"ContainerStarted","Data":"d4ffd208b9481cfe73ca8793cb29bd1e747c17e2d8d8c5aca31e6ee690add5a5"} Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.464070 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jfk7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c2131b2-ccbc-49ff-a0bc-fd6639563dd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0d7eacf1e5180d67f73b2e794d7a0a995619d05d1019a3e152e3fa4b7c4c369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5g8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jfk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.487728 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20a3f0f8-dc0e-4c2c-95ac-2ef9874425f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.497279 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.497316 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.497331 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.497349 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 
00:07:50.497363 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:50Z","lastTransitionTime":"2026-02-23T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.504054 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.516139 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a35acbfb5836e5ddac27f7e140e64897f9f2ad3615e4db66b091b79744c699d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c4f2f9d3265d84be4a241b5a2a0193896ad4e9daeec1aaca6fe4fef0be21c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.525652 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cba474f-2d55-4a07-969f-25e2817a06d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-blmnv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.538654 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26428" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29209462-90c5-4aa1-9943-d8b15ac1b5a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26428\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.550596 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1153ca78-e3e2-458f-8a31-e16738c1677c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43f8fa8d2d76248580ea4800c8e0e97c060feb42b7c78fa9be3f5adbeaaf9437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a47ab0d8562bafd3080c29289e6cd496a9f76188960a3b6036cc044609876406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd23f1d94c143d946e301f9af58c6fe189e1e95ef05c2648ad167acab65726df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb8e52070bd8c7a9784d6f81037eee627297839cf49ec88b861a6d8f3a7a16e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.564663 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://340ce9e1f14e0dc6339a25b1bd23f58d9790656cfea764bac921d60603920cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.576827 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04edd9324344faf24b9181705097f238050d69e14bf46da2f76f3d114552c942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.591748 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4gvxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b63c18f-b6b2-4d97-b542-7800b475bd4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4pzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4gvxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.603077 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.603116 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.603127 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.603145 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.603158 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:50Z","lastTransitionTime":"2026-02-23T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.614161 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66853c8a-9391-4291-b5f1-c72cb5fe23e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-59rkm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.638014 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40cc14f8-4b0d-4268-82bc-e9c2d8073cf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T00:07:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0223 00:07:35.797171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 00:07:35.799032 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1411013438/tls.crt::/tmp/serving-cert-1411013438/tls.key\\\\\\\"\\\\nI0223 00:07:41.683988 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0223 00:07:41.688343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0223 00:07:41.688375 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0223 00:07:41.688410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0223 00:07:41.688422 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0223 00:07:41.698776 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0223 00:07:41.698820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698840 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0223 00:07:41.698875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0223 00:07:41.698881 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0223 00:07:41.698924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0223 00:07:41.699038 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0223 00:07:41.701822 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.650038 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.662467 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.676009 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.688187 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a35acbfb5836e5ddac27f7e140e64897f9f2ad3615e4db66b091b79744c699d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c4f2f9d3265d84be4a241b5a2a0193896ad4e9daeec1aaca6fe4fef0be21c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.700075 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1153ca78-e3e2-458f-8a31-e16738c1677c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43f8fa8d2d76248580ea4800c8e0e97c060feb42b7c78fa9be3f5adbeaaf9437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a47ab0d8562bafd3080c29289e6cd496a9f76188960a3b6036cc044609876406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd23f1d94c143d946e301f9af58c6fe189e1e95ef05c2648ad167acab65726df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb8e52070bd8c7a9784d6f81037eee627297839cf49ec88b861a6d8f3a7a16e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.706757 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.706796 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.706806 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.706822 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.706836 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:50Z","lastTransitionTime":"2026-02-23T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.713255 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://340ce9e1f14e0dc6339a25b1bd23f58d9790656cfea764bac921d60603920cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.724409 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04edd9324344faf24b9181705097f238050d69e14bf46da2f76f3d114552c942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"ipta
bles-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.738459 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cba474f-2d55-4a07-969f-25e2817a06d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3ae04d2c5dac3e689b744efd8990e7210f627cf79b77aab8cdb4
77e22eedeef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://908fd79ca8e314a794a24b502c23d0b5abbc41b2bf079086e8fe6fc3ceaeafa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}
}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-blmnv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.751809 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26428" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29209462-90c5-4aa1-9943-d8b15ac1b5a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26428\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.765035 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40cc14f8-4b0d-4268-82bc-e9c2d8073cf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T00:07:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0223 00:07:35.797171 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 00:07:35.799032 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411013438/tls.crt::/tmp/serving-cert-1411013438/tls.key\\\\\\\"\\\\nI0223 00:07:41.683988 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0223 00:07:41.688343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0223 00:07:41.688375 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0223 00:07:41.688410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0223 00:07:41.688422 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0223 00:07:41.698776 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0223 00:07:41.698820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698840 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0223 00:07:41.698875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0223 00:07:41.698881 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0223 00:07:41.698924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0223 00:07:41.699038 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0223 00:07:41.701822 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.776610 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.788624 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.806655 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4gvxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b63c18f-b6b2-4d97-b542-7800b475bd4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33055c975ea730c1618798b3885ea706d784d23c63c80c8fbb66ffbc27318c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4pzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4gvxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.809580 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:50 crc 
kubenswrapper[4735]: I0223 00:07:50.809622 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.809631 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.809646 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.809656 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:50Z","lastTransitionTime":"2026-02-23T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.831411 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66853c8a-9391-4291-b5f1-c72cb5fe23e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-59rkm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.861730 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20a3f0f8-dc0e-4c2c-95ac-2ef9874425f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.875003 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jfk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c2131b2-ccbc-49ff-a0bc-fd6639563dd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0d7eacf1e5180d67f73b2e794d7a0a995619d05d1019a3e152e3fa4b7c4c369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5g8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jfk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:50Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.912921 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.912974 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.912989 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:50 crc kubenswrapper[4735]: I0223 00:07:50.913010 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:50 crc 
kubenswrapper[4735]: I0223 00:07:50.913022 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:50Z","lastTransitionTime":"2026-02-23T00:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.017561 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.017612 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.017631 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.017655 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.017676 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:51Z","lastTransitionTime":"2026-02-23T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.061677 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-zhj4f"] Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.062126 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-zhj4f" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.065396 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.065991 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.066188 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.066278 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.077330 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.094174 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a35acbfb5836e5ddac27f7e140e64897f9f2ad3615e4db66b091b79744c699d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c4f2f9d3265d84be4a241b5a2a0193896ad4e9daeec1aaca6fe4fef0be21c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.094334 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e1dd002d-a62b-432e-bc82-21ffdfc5e0b7-serviceca\") pod \"node-ca-zhj4f\" (UID: \"e1dd002d-a62b-432e-bc82-21ffdfc5e0b7\") " pod="openshift-image-registry/node-ca-zhj4f" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.094388 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8649q\" (UniqueName: \"kubernetes.io/projected/e1dd002d-a62b-432e-bc82-21ffdfc5e0b7-kube-api-access-8649q\") pod \"node-ca-zhj4f\" (UID: \"e1dd002d-a62b-432e-bc82-21ffdfc5e0b7\") " pod="openshift-image-registry/node-ca-zhj4f" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.094519 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e1dd002d-a62b-432e-bc82-21ffdfc5e0b7-host\") pod \"node-ca-zhj4f\" (UID: \"e1dd002d-a62b-432e-bc82-21ffdfc5e0b7\") " pod="openshift-image-registry/node-ca-zhj4f" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.107845 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zhj4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dd002d-a62b-432e-bc82-21ffdfc5e0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8649q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zhj4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.119964 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.120002 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.120017 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 
00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.120036 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.120052 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:51Z","lastTransitionTime":"2026-02-23T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.128605 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1153ca78-e3e2-458f-8a31-e16738c1677c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43f8fa8d2d76248580ea4800c8e0e97c060feb42b7c78fa9be3f5adbeaaf9437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85
aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a47ab0d8562bafd3080c29289e6cd496a9f76188960a3b6036cc044609876406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd23f1d94c143d946e301f9af58c6fe189e1e95ef05c2648ad167acab65726df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"st
artedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb8e52070bd8c7a9784d6f81037eee627297839cf49ec88b861a6d8f3a7a16e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.141807 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://340ce9e1f14e0dc6339a25b1bd23f58d9790656cfea764bac921d60603920cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.160025 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04edd9324344faf24b9181705097f238050d69e14bf46da2f76f3d114552c942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recur
siveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.174217 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cba474f-2d55-4a07-969f-25e2817a06d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3ae04d2c5dac3e689b744efd8990e7210f627cf79b77aab8cdb477e22eedeef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://908fd79ca8e314a794a24b502c23d0b5abbc41b2bf079086e8fe6fc3ceaeafa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-blmnv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2026-02-23T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.194704 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26428" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29209462-90c5-4aa1-9943-d8b15ac1b5a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26428\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.195316 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e1dd002d-a62b-432e-bc82-21ffdfc5e0b7-host\") pod \"node-ca-zhj4f\" (UID: \"e1dd002d-a62b-432e-bc82-21ffdfc5e0b7\") " pod="openshift-image-registry/node-ca-zhj4f" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.195383 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e1dd002d-a62b-432e-bc82-21ffdfc5e0b7-serviceca\") pod \"node-ca-zhj4f\" (UID: \"e1dd002d-a62b-432e-bc82-21ffdfc5e0b7\") " pod="openshift-image-registry/node-ca-zhj4f" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.195404 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8649q\" (UniqueName: 
\"kubernetes.io/projected/e1dd002d-a62b-432e-bc82-21ffdfc5e0b7-kube-api-access-8649q\") pod \"node-ca-zhj4f\" (UID: \"e1dd002d-a62b-432e-bc82-21ffdfc5e0b7\") " pod="openshift-image-registry/node-ca-zhj4f" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.195434 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e1dd002d-a62b-432e-bc82-21ffdfc5e0b7-host\") pod \"node-ca-zhj4f\" (UID: \"e1dd002d-a62b-432e-bc82-21ffdfc5e0b7\") " pod="openshift-image-registry/node-ca-zhj4f" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.196287 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e1dd002d-a62b-432e-bc82-21ffdfc5e0b7-serviceca\") pod \"node-ca-zhj4f\" (UID: \"e1dd002d-a62b-432e-bc82-21ffdfc5e0b7\") " pod="openshift-image-registry/node-ca-zhj4f" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.214787 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 22:48:28.449745425 +0000 UTC Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.217629 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8649q\" (UniqueName: \"kubernetes.io/projected/e1dd002d-a62b-432e-bc82-21ffdfc5e0b7-kube-api-access-8649q\") pod \"node-ca-zhj4f\" (UID: \"e1dd002d-a62b-432e-bc82-21ffdfc5e0b7\") " pod="openshift-image-registry/node-ca-zhj4f" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.224415 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.224455 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.224464 4735 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.224482 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.224498 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:51Z","lastTransitionTime":"2026-02-23T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.232154 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40cc14f8-4b0d-4268-82bc-e9c2d8073cf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T00:07:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0223 00:07:35.797171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 00:07:35.799032 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1411013438/tls.crt::/tmp/serving-cert-1411013438/tls.key\\\\\\\"\\\\nI0223 00:07:41.683988 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0223 00:07:41.688343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0223 00:07:41.688375 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0223 00:07:41.688410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0223 00:07:41.688422 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0223 00:07:41.698776 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0223 00:07:41.698820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698840 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0223 00:07:41.698875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0223 00:07:41.698881 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0223 00:07:41.698924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0223 00:07:41.699038 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0223 00:07:41.701822 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.270555 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.271640 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:07:51 crc kubenswrapper[4735]: E0223 00:07:51.271800 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.308396 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.326640 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.326695 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.326714 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.326734 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.326749 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:51Z","lastTransitionTime":"2026-02-23T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.346227 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4gvxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b63c18f-b6b2-4d97-b542-7800b475bd4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33055c975ea730c1618798b3885ea706d784d23c63c80c8fbb66ffbc27318c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4pzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4gvxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.374764 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-zhj4f" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.397092 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66853c8a-9391-4291-b5f1-c72cb5fe23e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-59rkm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:51 crc kubenswrapper[4735]: W0223 00:07:51.402728 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1dd002d_a62b_432e_bc82_21ffdfc5e0b7.slice/crio-7ac10e0bee38821fda562fbad4fa5f9b8324a9a1f3fc5d0ca02843460f3c8ea7 WatchSource:0}: Error finding container 7ac10e0bee38821fda562fbad4fa5f9b8324a9a1f3fc5d0ca02843460f3c8ea7: Status 404 returned error can't find the container with id 7ac10e0bee38821fda562fbad4fa5f9b8324a9a1f3fc5d0ca02843460f3c8ea7 Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.429566 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.430060 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.430072 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.430090 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.430102 4735 setters.go:603] 
"Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:51Z","lastTransitionTime":"2026-02-23T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.437499 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20a3f0f8-dc0e-4c2c-95ac-2ef9874425f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mo
untPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a047
4bfde2039ae65385872daabc7afbae10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\
\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.461514 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zhj4f" event={"ID":"e1dd002d-a62b-432e-bc82-21ffdfc5e0b7","Type":"ContainerStarted","Data":"7ac10e0bee38821fda562fbad4fa5f9b8324a9a1f3fc5d0ca02843460f3c8ea7"} Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.465245 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-26428" event={"ID":"29209462-90c5-4aa1-9943-d8b15ac1b5a3","Type":"ContainerStarted","Data":"daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7"} Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.467564 4735 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-dns/node-resolver-jfk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c2131b2-ccbc-49ff-a0bc-fd6639563dd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0d7eacf1e5180d67f73b2e794d7a0a995619d05d1019a3e152e3fa4b7c4c369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5g8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"pha
se\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jfk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.472553 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" event={"ID":"66853c8a-9391-4291-b5f1-c72cb5fe23e8","Type":"ContainerStarted","Data":"2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd"} Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.472592 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" event={"ID":"66853c8a-9391-4291-b5f1-c72cb5fe23e8","Type":"ContainerStarted","Data":"b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716"} Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.472603 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" event={"ID":"66853c8a-9391-4291-b5f1-c72cb5fe23e8","Type":"ContainerStarted","Data":"8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442"} Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.472615 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" event={"ID":"66853c8a-9391-4291-b5f1-c72cb5fe23e8","Type":"ContainerStarted","Data":"b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7"} Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.472627 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" 
event={"ID":"66853c8a-9391-4291-b5f1-c72cb5fe23e8","Type":"ContainerStarted","Data":"9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2"} Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.505594 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.532360 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.532423 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.532433 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.532472 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.532483 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:51Z","lastTransitionTime":"2026-02-23T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.545278 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a35acbfb5836e5ddac27f7e140e64897f9f2ad3615e4db66b091b79744c699d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c4f2f9d3265d84be4a241b5a2a0193896ad4e9daeec1aaca6fe4fef0be21c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.583272 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zhj4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dd002d-a62b-432e-bc82-21ffdfc5e0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8649q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zhj4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.627570 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26428" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29209462-90c5-4aa1-9943-d8b15ac1b5a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26428\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.635228 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.635258 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.635270 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.635288 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 
00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.635303 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:51Z","lastTransitionTime":"2026-02-23T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.666177 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1153ca78-e3e2-458f-8a31-e16738c1677c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43f8fa8d2d76248580ea4800c8e0e97c060feb42b7c78fa9be3f5adbeaaf9437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a47ab0d8562bafd3080c29289e6cd496a9f76188960a3b6036cc044609876406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd23f1d94c143d946e301f9af58c6fe189e1e95ef05c2648ad167acab65726df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb8e52070bd8c7a9784d6f81037eee627297839cf49ec88b861a6d8f3a7a16e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.706389 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://340ce9e1f14e0dc6339a25b1bd23f58d9790656cfea764bac921d60603920cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.738074 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.738145 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.738157 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.738178 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.738208 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:51Z","lastTransitionTime":"2026-02-23T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.751937 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04edd9324344faf24b9181705097f238050d69e14bf46da2f76f3d114552c942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.788091 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cba474f-2d55-4a07-969f-25e2817a06d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3ae04d2c5dac3e689b744efd8990e7210f627cf79b77aab8cdb477e22eedeef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://908fd79ca8e314a794a24b502c23d0b5abbc41b2bf079086e8fe6fc3ceaeafa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-blmnv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-23T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.841298 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.841345 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.841361 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.841383 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.841397 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:51Z","lastTransitionTime":"2026-02-23T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.842025 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66853c8a-9391-4291-b5f1-c72cb5fe23e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-59rkm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.873336 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40cc14f8-4b0d-4268-82bc-e9c2d8073cf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T00:07:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0223 00:07:35.797171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 00:07:35.799032 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1411013438/tls.crt::/tmp/serving-cert-1411013438/tls.key\\\\\\\"\\\\nI0223 00:07:41.683988 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0223 00:07:41.688343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0223 00:07:41.688375 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0223 00:07:41.688410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0223 00:07:41.688422 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0223 00:07:41.698776 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0223 00:07:41.698820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698840 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0223 00:07:41.698875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0223 00:07:41.698881 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0223 00:07:41.698924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0223 00:07:41.699038 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0223 00:07:41.701822 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.908633 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.944111 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.944172 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.944189 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:51 crc 
kubenswrapper[4735]: I0223 00:07:51.944216 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.944237 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:51Z","lastTransitionTime":"2026-02-23T00:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.947067 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:51 crc kubenswrapper[4735]: I0223 00:07:51.991588 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4gvxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b63c18f-b6b2-4d97-b542-7800b475bd4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33055c975ea730c1618798b3885ea706d784d23c63c80c8fbb66ffbc27318c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4pzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4gvxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:51Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.051059 4735 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 23 
00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.053281 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.053359 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.053381 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.053414 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.053436 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:52Z","lastTransitionTime":"2026-02-23T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:52 crc kubenswrapper[4735]: W0223 00:07:52.053805 4735 reflector.go:484] object-"openshift-image-registry"/"node-ca-dockercfg-4777p": watch of *v1.Secret ended with: very short watch: object-"openshift-image-registry"/"node-ca-dockercfg-4777p": Unexpected watch close - watch lasted less than a second and no items received Feb 23 00:07:52 crc kubenswrapper[4735]: W0223 00:07:52.053985 4735 reflector.go:484] object-"openshift-image-registry"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-image-registry"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.054451 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20a3f0f8-dc0e-4c2c-95ac-2ef9874425f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a673
14731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\
\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd/pods/etcd-crc/status\": read tcp 38.102.83.213:43092->38.102.83.213:6443: use of closed network connection" Feb 23 00:07:52 crc kubenswrapper[4735]: W0223 00:07:52.055892 4735 reflector.go:484] object-"openshift-image-registry"/"image-registry-certificates": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-image-registry"/"image-registry-certificates": Unexpected watch close - watch lasted less than a second and no items received Feb 23 00:07:52 crc kubenswrapper[4735]: W0223 00:07:52.056042 4735 reflector.go:484] object-"openshift-image-registry"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-image-registry"/"openshift-service-ca.crt": Unexpected 
watch close - watch lasted less than a second and no items received Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.085066 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jfk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c2131b2-ccbc-49ff-a0bc-fd6639563dd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0d7eacf1e5180d67f73b2e794d7a0a995619d05d1019a3e152e3fa4b7c4c369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-r5g8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jfk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:52Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.156668 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.156733 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.156751 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.156775 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.156790 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:52Z","lastTransitionTime":"2026-02-23T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.215964 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 21:21:14.824026708 +0000 UTC Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.260059 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.260110 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.260122 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.260142 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.260155 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:52Z","lastTransitionTime":"2026-02-23T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.271461 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:07:52 crc kubenswrapper[4735]: E0223 00:07:52.271588 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.275982 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:07:52 crc kubenswrapper[4735]: E0223 00:07:52.276147 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.296834 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:52Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.309071 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:52Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.320056 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4gvxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b63c18f-b6b2-4d97-b542-7800b475bd4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33055c975ea730c1618798b3885ea706d784d23c63c80c8fbb66ffbc27318c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4pzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4gvxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:52Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.348599 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66853c8a-9391-4291-b5f1-c72cb5fe23e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-59rkm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:52Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.360567 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40cc14f8-4b0d-4268-82bc-e9c2d8073cf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T00:07:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0223 00:07:35.797171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 00:07:35.799032 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1411013438/tls.crt::/tmp/serving-cert-1411013438/tls.key\\\\\\\"\\\\nI0223 00:07:41.683988 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0223 00:07:41.688343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0223 00:07:41.688375 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0223 00:07:41.688410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0223 00:07:41.688422 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0223 00:07:41.698776 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0223 00:07:41.698820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698840 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0223 00:07:41.698875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0223 00:07:41.698881 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0223 00:07:41.698924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0223 00:07:41.699038 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0223 00:07:41.701822 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:52Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.362246 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.362290 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.362302 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.362320 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.362332 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:52Z","lastTransitionTime":"2026-02-23T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.385230 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20a3f0f8-dc0e-4c2c-95ac-2ef9874425f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:52Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.395570 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jfk7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c2131b2-ccbc-49ff-a0bc-fd6639563dd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0d7eacf1e5180d67f73b2e794d7a0a995619d05d1019a3e152e3fa4b7c4c369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5g8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jfk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:52Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.410242 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a35acbfb5836e5ddac27f7e140e64897f9f2ad3615e4db66b091b79744c699d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c4f2f9d3265d84be4a241b5a2a0193896ad4e9daeec1aaca6fe4fef0be21c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:52Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.422423 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zhj4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dd002d-a62b-432e-bc82-21ffdfc5e0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8649q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zhj4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:52Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.464892 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.465665 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.465681 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 
00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.465706 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.465721 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:52Z","lastTransitionTime":"2026-02-23T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.466251 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:52Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.477640 4735 generic.go:334] "Generic (PLEG): container finished" podID="29209462-90c5-4aa1-9943-d8b15ac1b5a3" containerID="daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7" exitCode=0 Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.477735 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-26428" event={"ID":"29209462-90c5-4aa1-9943-d8b15ac1b5a3","Type":"ContainerDied","Data":"daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7"} Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.479279 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zhj4f" event={"ID":"e1dd002d-a62b-432e-bc82-21ffdfc5e0b7","Type":"ContainerStarted","Data":"a3765b3d0261e124aa7d128e9bdf43a6c76939b5d63f3960ab38a129bf3476c3"} Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.483283 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" event={"ID":"66853c8a-9391-4291-b5f1-c72cb5fe23e8","Type":"ContainerStarted","Data":"a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527"} Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.508581 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://340ce9e1f14e0dc6339a25b1bd23f58d9790656cfea764bac921d60603920cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:52Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.550384 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04edd9324344faf24b9181705097f238050d69e14bf46da2f76f3d114552c942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:52Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.569637 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.569690 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.569703 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.569728 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.569742 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:52Z","lastTransitionTime":"2026-02-23T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.585797 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cba474f-2d55-4a07-969f-25e2817a06d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3ae04d2c5dac3e689b744efd8990e7210f627cf79b77aab8cdb477e22eedeef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://908fd79ca8e314a794a24b502c23d0b5abbc41b2bf079086e8fe6fc3ceaeafa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-blmnv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:52Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.632746 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26428" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29209462-90c5-4aa1-9943-d8b15ac1b5a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26428\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:52Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.668533 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1153ca78-e3e2-458f-8a31-e16738c1677c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43f8fa8d2d76248580ea4800c8e0e97c060feb42b7c78fa9be3f5adbeaaf9437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a47ab0d8562bafd3080c29289e6cd496a9f76188960a3b6036cc044609876406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd23f1d94c143d946e301f9af58c6fe189e1e95ef05c2648ad167acab65726df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb8e52070bd8c7a9784d6f81037eee627297839cf49ec88b861a6d8f3a7a16e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:52Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.673582 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.673621 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.673637 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.673653 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.673665 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:52Z","lastTransitionTime":"2026-02-23T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.708438 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:52Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.755956 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a35acbfb5836e5ddac27f7e140e64897f9f2ad3615e4db66b091b79744c699d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c4f2f9d3265d84be4a241b5a2a0193896ad4e9daeec1aaca6fe4fef0be21c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:52Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.776494 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.776548 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.776562 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.776582 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.776596 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:52Z","lastTransitionTime":"2026-02-23T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.786206 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zhj4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dd002d-a62b-432e-bc82-21ffdfc5e0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3765b3d0261e124aa7d128e9bdf43a6c76939b5d63f3960ab38a129bf3476c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8649q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zhj4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:52Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.827515 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1153ca78-e3e2-458f-8a31-e16738c1677c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43f8fa8d2d76248580ea4800c8e0e97c060feb42b7c78fa9be3f5adbeaaf9437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a47ab0d8562bafd3080c29289e6cd496a9f76188960a3b6036cc044609876406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd23f1d94c143d946e301f9af58c6fe189e1e95ef05c2648ad167acab65726df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb8e52070bd8c7a9784d6f81037eee627297839cf49ec88b861a6d8f3a7a16e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:52Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.866139 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://340ce9e1f14e0dc6339a25b1bd23f58d9790656cfea764bac921d60603920cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:52Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.880000 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.880087 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.880113 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.880147 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.880171 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:52Z","lastTransitionTime":"2026-02-23T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.909123 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04edd9324344faf24b9181705097f238050d69e14bf46da2f76f3d114552c942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:52Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.944155 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cba474f-2d55-4a07-969f-25e2817a06d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3ae04d2c5dac3e689b744efd8990e7210f627cf79b77aab8cdb477e22eedeef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://908fd79ca8e314a794a24b502c23d0b5abbc41b2bf079086e8fe6fc3ceaeafa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-blmnv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-23T00:07:52Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.982642 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.982686 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.982696 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.982715 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.982729 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:52Z","lastTransitionTime":"2026-02-23T00:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:52 crc kubenswrapper[4735]: I0223 00:07:52.993066 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26428" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29209462-90c5-4aa1-9943-d8b15ac1b5a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26428\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:52Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.031649 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40cc14f8-4b0d-4268-82bc-e9c2d8073cf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T00:07:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0223 00:07:35.797171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 00:07:35.799032 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411013438/tls.crt::/tmp/serving-cert-1411013438/tls.key\\\\\\\"\\\\nI0223 00:07:41.683988 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0223 00:07:41.688343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0223 00:07:41.688375 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0223 00:07:41.688410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0223 00:07:41.688422 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0223 00:07:41.698776 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0223 00:07:41.698820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698840 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0223 00:07:41.698875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0223 00:07:41.698881 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0223 00:07:41.698924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0223 00:07:41.699038 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0223 00:07:41.701822 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:53Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.037451 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.085409 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.085453 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.085465 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.085482 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" 
Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.085492 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:53Z","lastTransitionTime":"2026-02-23T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.088939 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:53Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.130573 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:53Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.175917 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4gvxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b63c18f-b6b2-4d97-b542-7800b475bd4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33055c975ea730c1618798b3885ea706d784d23c63c80c8fbb66ffbc27318c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4pzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4gvxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:53Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.188371 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:53 crc 
kubenswrapper[4735]: I0223 00:07:53.188429 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.188439 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.188458 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.188470 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:53Z","lastTransitionTime":"2026-02-23T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.216532 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 03:46:21.330605052 +0000 UTC Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.223427 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66853c8a-9391-4291-b5f1-c72cb5fe23e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-59rkm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:53Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.262786 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20a3f0f8-dc0e-4c2c-95ac-2ef9874425f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:53Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.271233 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:07:53 crc kubenswrapper[4735]: E0223 00:07:53.271404 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.288267 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jfk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c2131b2-ccbc-49ff-a0bc-fd6639563dd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0d7eacf1e5180d67f73b2e794d7a0a995619d05d1019a3e152e3fa4b7c4c369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5g8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jfk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:53Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.292507 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.292608 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.292636 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.292684 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.292749 4735 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:53Z","lastTransitionTime":"2026-02-23T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.396029 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.396104 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.396123 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.396150 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.396173 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:53Z","lastTransitionTime":"2026-02-23T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.491782 4735 generic.go:334] "Generic (PLEG): container finished" podID="29209462-90c5-4aa1-9943-d8b15ac1b5a3" containerID="c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec" exitCode=0 Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.491870 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-26428" event={"ID":"29209462-90c5-4aa1-9943-d8b15ac1b5a3","Type":"ContainerDied","Data":"c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec"} Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.499888 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.499986 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.500009 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.500033 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.500089 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:53Z","lastTransitionTime":"2026-02-23T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.511806 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.514573 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:53Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.516158 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.541008 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a35acbfb5836e5ddac27f7e140e64897f9f2ad3615e4db66b091b79744c699d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c4f2f9d3265d84be4a241b5a2a0193896ad4e9daeec1aaca6fe4fef0be21c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:53Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.558479 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zhj4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dd002d-a62b-432e-bc82-21ffdfc5e0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3765b3d0261e124aa7d128e9bdf43a6c76939b5d63f3960ab38a129bf3476c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8649q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zhj4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:53Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.578837 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1153ca78-e3e2-458f-8a31-e16738c1677c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43f8fa8d2d76248580ea4800c8e0e97c060feb42b7c78fa9be3f5adbeaaf9437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a47ab0d8562bafd3080c29289e6cd496a9f76188960a3b6036cc044609876406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd23f1d94c143d946e301f9af58c6fe189e1e95ef05c2648ad167acab65726df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb8e52070bd8c7a9784d6f81037eee627297839cf49ec88b861a6d8f3a7a16e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:53Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.599943 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://340ce9e1f14e0dc6339a25b1bd23f58d9790656cfea764bac921d60603920cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:53Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.602870 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.603058 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.603074 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.603092 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.603104 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:53Z","lastTransitionTime":"2026-02-23T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.620021 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04edd9324344faf24b9181705097f238050d69e14bf46da2f76f3d114552c942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:53Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.634608 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.639898 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cba474f-2d55-4a07-969f-25e2817a06d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3ae04d2c5dac3e689b744efd8990e7210f627cf79b77aab8cdb477e22eedeef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://908fd79ca8e314a794a24b502c23d0b5abbc41b2bf079086e8fe6fc3ceaeafa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-blmnv\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:53Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.671450 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26428" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29209462-90c5-4aa1-9943-d8b15ac1b5a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26428\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:53Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 
00:07:53.705568 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.705600 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.705608 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.705622 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.705650 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:53Z","lastTransitionTime":"2026-02-23T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.708366 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40cc14f8-4b0d-4268-82bc-e9c2d8073cf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T00:07:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0223 00:07:35.797171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 00:07:35.799032 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1411013438/tls.crt::/tmp/serving-cert-1411013438/tls.key\\\\\\\"\\\\nI0223 00:07:41.683988 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0223 00:07:41.688343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0223 00:07:41.688375 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0223 00:07:41.688410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0223 00:07:41.688422 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0223 00:07:41.698776 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0223 00:07:41.698820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698840 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0223 00:07:41.698875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0223 00:07:41.698881 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0223 00:07:41.698924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0223 00:07:41.699038 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0223 00:07:41.701822 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:53Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.751324 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:53Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.786151 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:53Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.809317 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.809357 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.809372 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.809395 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.809408 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:53Z","lastTransitionTime":"2026-02-23T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.830395 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4gvxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b63c18f-b6b2-4d97-b542-7800b475bd4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33055c975ea730c1618798b3885ea706d784d23c63c80c8fbb66ffbc27318c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4pzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4gvxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:53Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.874965 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66853c8a-9391-4291-b5f1-c72cb5fe23e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-59rkm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:53Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.913728 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.913809 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.913833 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.913902 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.913929 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:53Z","lastTransitionTime":"2026-02-23T00:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.916693 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20a3f0f8-dc0e-4c2c-95ac-2ef9874425f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:53Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:53 crc kubenswrapper[4735]: I0223 00:07:53.946510 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jfk7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c2131b2-ccbc-49ff-a0bc-fd6639563dd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0d7eacf1e5180d67f73b2e794d7a0a995619d05d1019a3e152e3fa4b7c4c369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5g8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jfk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:53Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.016962 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.017115 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.017134 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.017153 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.017169 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:54Z","lastTransitionTime":"2026-02-23T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.121468 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.121535 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.121557 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.121585 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.121604 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:54Z","lastTransitionTime":"2026-02-23T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.217441 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 17:23:23.921334182 +0000 UTC Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.226003 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.226056 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.226074 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.226100 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.226117 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:54Z","lastTransitionTime":"2026-02-23T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.271415 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.271427 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:07:54 crc kubenswrapper[4735]: E0223 00:07:54.272095 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 00:07:54 crc kubenswrapper[4735]: E0223 00:07:54.272258 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.328908 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.328955 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.328967 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.328981 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.328995 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:54Z","lastTransitionTime":"2026-02-23T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.433146 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.433212 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.433224 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.433242 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.433253 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:54Z","lastTransitionTime":"2026-02-23T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.525487 4735 generic.go:334] "Generic (PLEG): container finished" podID="29209462-90c5-4aa1-9943-d8b15ac1b5a3" containerID="13db4202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214" exitCode=0 Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.525574 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-26428" event={"ID":"29209462-90c5-4aa1-9943-d8b15ac1b5a3","Type":"ContainerDied","Data":"13db4202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214"} Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.537509 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.537535 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.537545 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.537560 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.537571 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:54Z","lastTransitionTime":"2026-02-23T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.538437 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" event={"ID":"66853c8a-9391-4291-b5f1-c72cb5fe23e8","Type":"ContainerStarted","Data":"fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f"} Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.542580 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:54Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.557867 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a35acbfb5836e5ddac27f7e140e64897f9f2ad3615e4db66b091b79744c699d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c4f2f9d3265d84be4a241b5a2a0193896ad4e9daeec1aaca6fe4fef0be21c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:54Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.569088 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zhj4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dd002d-a62b-432e-bc82-21ffdfc5e0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3765b3d0261e124aa7d128e9bdf43a6c76939b5d63f3960ab38a129bf3476c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8649q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zhj4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:54Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.582256 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1153ca78-e3e2-458f-8a31-e16738c1677c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43f8fa8d2d76248580ea4800c8e0e97c060feb42b7c78fa9be3f5adbeaaf9437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a47ab0d8562bafd3080c29289e6cd496a9f76188960a3b6036cc044609876406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd23f1d94c143d946e301f9af58c6fe189e1e95ef05c2648ad167acab65726df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb8e52070bd8c7a9784d6f81037eee627297839cf49ec88b861a6d8f3a7a16e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:54Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.595800 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://340ce9e1f14e0dc6339a25b1bd23f58d9790656cfea764bac921d60603920cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:54Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.616431 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04edd9324344faf24b9181705097f238050d69e14bf46da2f76f3d114552c942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:54Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.630381 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cba474f-2d55-4a07-969f-25e2817a06d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3ae04d2c5dac3e689b744efd8990e7210f627cf79b77aab8cdb477e22eedeef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://908fd79ca8e314a794a24b502c23d0b5abbc41b2bf079086e8fe6fc3ceaeafa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-blmnv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-02-23T00:07:54Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.640965 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.641034 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.641058 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.641086 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.641109 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:54Z","lastTransitionTime":"2026-02-23T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.646586 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26428" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29209462-90c5-4aa1-9943-d8b15ac1b5a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13db4202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13db4202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-26428\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:54Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.667502 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40cc14f8-4b0d-4268-82bc-e9c2d8073cf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T00:07:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0223 00:07:35.797171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 00:07:35.799032 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1411013438/tls.crt::/tmp/serving-cert-1411013438/tls.key\\\\\\\"\\\\nI0223 00:07:41.683988 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0223 00:07:41.688343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0223 00:07:41.688375 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0223 00:07:41.688410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0223 00:07:41.688422 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0223 00:07:41.698776 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0223 00:07:41.698820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698840 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0223 00:07:41.698875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0223 00:07:41.698881 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0223 00:07:41.698924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0223 00:07:41.699038 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0223 00:07:41.701822 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:54Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.686614 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:54Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.704642 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:54Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.718283 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4gvxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b63c18f-b6b2-4d97-b542-7800b475bd4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33055c975ea730c1618798b3885ea706d784d23c63c80c8fbb66ffbc27318c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4pzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4gvxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:54Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.742457 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66853c8a-9391-4291-b5f1-c72cb5fe23e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-59rkm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:54Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.743775 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.743836 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.743891 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.743923 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.743945 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:54Z","lastTransitionTime":"2026-02-23T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.769685 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20a3f0f8-dc0e-4c2c-95ac-2ef9874425f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:54Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.781793 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jfk7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c2131b2-ccbc-49ff-a0bc-fd6639563dd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0d7eacf1e5180d67f73b2e794d7a0a995619d05d1019a3e152e3fa4b7c4c369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5g8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jfk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:54Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.847231 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.847285 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.847302 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.847327 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.847344 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:54Z","lastTransitionTime":"2026-02-23T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.949837 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.949965 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.949989 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.950023 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:54 crc kubenswrapper[4735]: I0223 00:07:54.950045 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:54Z","lastTransitionTime":"2026-02-23T00:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.057140 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.057210 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.057227 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.057248 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.057270 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:55Z","lastTransitionTime":"2026-02-23T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.160672 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.160773 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.160792 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.160819 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.160837 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:55Z","lastTransitionTime":"2026-02-23T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.218403 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 14:05:06.543706226 +0000 UTC Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.264114 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.264198 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.264219 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.264254 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.264274 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:55Z","lastTransitionTime":"2026-02-23T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.271653 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:07:55 crc kubenswrapper[4735]: E0223 00:07:55.271833 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.367699 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.367753 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.367768 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.367787 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.367801 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:55Z","lastTransitionTime":"2026-02-23T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.470363 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.470422 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.470444 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.470469 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.470486 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:55Z","lastTransitionTime":"2026-02-23T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.548916 4735 generic.go:334] "Generic (PLEG): container finished" podID="29209462-90c5-4aa1-9943-d8b15ac1b5a3" containerID="d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f" exitCode=0 Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.548968 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-26428" event={"ID":"29209462-90c5-4aa1-9943-d8b15ac1b5a3","Type":"ContainerDied","Data":"d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f"} Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.573448 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.573520 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.573538 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.573565 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.573582 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:55Z","lastTransitionTime":"2026-02-23T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.575089 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1153ca78-e3e2-458f-8a31-e16738c1677c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43f8fa8d2d76248580ea4800c8e0e97c060feb42b7c78fa9be3f5adbeaaf9437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a47ab0d8562
bafd3080c29289e6cd496a9f76188960a3b6036cc044609876406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd23f1d94c143d946e301f9af58c6fe189e1e95ef05c2648ad167acab65726df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb8e52070bd8c7a9784d6f81037eee627297839cf49ec88b861a6d8f3a7a16e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:55Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.598303 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://340ce9e1f14e0dc6339a25b1bd23f58d9790656cfea764bac921d60603920cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:55Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.616786 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04edd9324344faf24b9181705097f238050d69e14bf46da2f76f3d114552c942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:55Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.630581 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cba474f-2d55-4a07-969f-25e2817a06d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3ae04d2c5dac3e689b744efd8990e7210f627cf79b77aab8cdb477e22eedeef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://908fd79ca8e314a794a24b502c23d0b5abbc41b2bf079086e8fe6fc3ceaeafa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-blmnv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-02-23T00:07:55Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.648289 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26428" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29209462-90c5-4aa1-9943-d8b15ac1b5a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13db4202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13db4202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26428\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:55Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.669384 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40cc14f8-4b0d-4268-82bc-e9c2d8073cf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T00:07:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0223 00:07:35.797171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 00:07:35.799032 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1411013438/tls.crt::/tmp/serving-cert-1411013438/tls.key\\\\\\\"\\\\nI0223 00:07:41.683988 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0223 00:07:41.688343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0223 00:07:41.688375 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0223 00:07:41.688410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0223 00:07:41.688422 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0223 00:07:41.698776 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0223 00:07:41.698820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698840 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0223 00:07:41.698875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0223 00:07:41.698881 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0223 00:07:41.698924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0223 00:07:41.699038 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0223 00:07:41.701822 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:55Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.676531 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.676575 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.676586 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.676603 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.676615 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:55Z","lastTransitionTime":"2026-02-23T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.690115 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:55Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.726295 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:55Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.771711 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4gvxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b63c18f-b6b2-4d97-b542-7800b475bd4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33055c975ea730c1618798b3885ea706d784d23c63c80c8fbb66ffbc27318c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4pzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4gvxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:55Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.779314 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:55 crc 
kubenswrapper[4735]: I0223 00:07:55.779342 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.779351 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.779367 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.779375 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:55Z","lastTransitionTime":"2026-02-23T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.794436 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66853c8a-9391-4291-b5f1-c72cb5fe23e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-59rkm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:55Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.813890 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20a3f0f8-dc0e-4c2c-95ac-2ef9874425f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:55Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.853908 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jfk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c2131b2-ccbc-49ff-a0bc-fd6639563dd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0d7eacf1e5180d67f73b2e794d7a0a995619d05d1019a3e152e3fa4b7c4c369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5g8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jfk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:55Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.870719 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:55Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.882193 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.882249 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.882259 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.882273 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.882283 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:55Z","lastTransitionTime":"2026-02-23T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.886556 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a35acbfb5836e5ddac27f7e140e64897f9f2ad3615e4db66b091b79744c699d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c4f2f9d3265d84be4a241b5a2a0193896ad4e9daeec1aaca6fe4fef0be21c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:55Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.897498 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zhj4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dd002d-a62b-432e-bc82-21ffdfc5e0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3765b3d0261e124aa7d128e9bdf43a6c76939b5d63f3960ab38a129bf3476c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8649q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zhj4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:55Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.985551 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.985617 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.985635 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.985661 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:55 crc kubenswrapper[4735]: I0223 00:07:55.985690 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:55Z","lastTransitionTime":"2026-02-23T00:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.089129 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.089201 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.089218 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.089243 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.089261 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:56Z","lastTransitionTime":"2026-02-23T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.193240 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.193295 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.193312 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.193337 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.193354 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:56Z","lastTransitionTime":"2026-02-23T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.219410 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 21:38:55.325326434 +0000 UTC Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.271267 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.271305 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:07:56 crc kubenswrapper[4735]: E0223 00:07:56.271473 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 00:07:56 crc kubenswrapper[4735]: E0223 00:07:56.271603 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.296970 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.297025 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.297045 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.297074 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.297096 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:56Z","lastTransitionTime":"2026-02-23T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.401300 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.401365 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.401385 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.401420 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.401438 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:56Z","lastTransitionTime":"2026-02-23T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.504930 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.505164 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.505306 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.505454 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.505577 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:56Z","lastTransitionTime":"2026-02-23T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.559028 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" event={"ID":"66853c8a-9391-4291-b5f1-c72cb5fe23e8","Type":"ContainerStarted","Data":"f1fbdabf48ee491025ac86e29f91e3da94414ebbe726d99e706b7ed31fcce58c"} Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.559341 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.561107 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.561261 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.568250 4735 generic.go:334] "Generic (PLEG): container finished" podID="29209462-90c5-4aa1-9943-d8b15ac1b5a3" containerID="cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8" exitCode=0 Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.568362 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-26428" event={"ID":"29209462-90c5-4aa1-9943-d8b15ac1b5a3","Type":"ContainerDied","Data":"cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8"} Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.582213 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40cc14f8-4b0d-4268-82bc-e9c2d8073cf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T00:07:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0223 00:07:35.797171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 00:07:35.799032 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411013438/tls.crt::/tmp/serving-cert-1411013438/tls.key\\\\\\\"\\\\nI0223 00:07:41.683988 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0223 00:07:41.688343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0223 00:07:41.688375 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0223 00:07:41.688410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0223 00:07:41.688422 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0223 00:07:41.698776 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0223 00:07:41.698820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698840 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0223 00:07:41.698875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0223 00:07:41.698881 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0223 00:07:41.698924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0223 00:07:41.699038 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0223 00:07:41.701822 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:56Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.587991 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.593730 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.602616 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:56Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.608965 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.609020 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.609038 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.609064 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.609084 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:56Z","lastTransitionTime":"2026-02-23T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.625996 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:56Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.650505 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4gvxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b63c18f-b6b2-4d97-b542-7800b475bd4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33055c975ea730c1618798b3885ea706d784d23c63c80c8fbb66ffbc27318c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4pzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4gvxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:56Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.687969 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66853c8a-9391-4291-b5f1-c72cb5fe23e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fbdabf48ee491025ac86e29f91e3da94414ebbe726d99e706b7ed31fcce58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-59rkm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:56Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.712517 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.712565 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.712573 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.712587 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.712595 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:56Z","lastTransitionTime":"2026-02-23T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.723684 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20a3f0f8-dc0e-4c2c-95ac-2ef9874425f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:56Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.738851 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jfk7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c2131b2-ccbc-49ff-a0bc-fd6639563dd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0d7eacf1e5180d67f73b2e794d7a0a995619d05d1019a3e152e3fa4b7c4c369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5g8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jfk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:56Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.752339 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:56Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.771026 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a35acbfb5836e5ddac27f7e140e64897f9f2ad3615e4db66b091b79744c699d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c4f2f9d3265d84be4a241b5a2a0193896ad4e9daeec1aaca6fe4fef0be21c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:56Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.789678 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zhj4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dd002d-a62b-432e-bc82-21ffdfc5e0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3765b3d0261e124aa7d128e9bdf43a6c76939b5d63f3960ab38a129bf3476c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8649q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zhj4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:56Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.812479 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1153ca78-e3e2-458f-8a31-e16738c1677c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43f8fa8d2d76248580ea4800c8e0e97c060feb42b7c78fa9be3f5adbeaaf9437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee8
8051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a47ab0d8562bafd3080c29289e6cd496a9f76188960a3b6036cc044609876406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd23f1d94c143d946e301f9af58c6fe189e1e95ef05c2648ad167acab65726df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\
\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb8e52070bd8c7a9784d6f81037eee627297839cf49ec88b861a6d8f3a7a16e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:56Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.815722 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.815766 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.815781 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.815802 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.815818 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:56Z","lastTransitionTime":"2026-02-23T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.825465 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://340ce9e1f14e0dc6339a25b1bd23f58d9790656cfea764bac921d60603920cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:56Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.837029 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04edd9324344faf24b9181705097f238050d69e14bf46da2f76f3d114552c942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:56Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.850404 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cba474f-2d55-4a07-969f-25e2817a06d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3ae04d2c5dac3e689b744efd8990e7210f627cf79b77aab8cdb477e22eedeef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://908fd79ca8e314a794a24b502c23d0b5abbc41b2bf079086e8fe6fc3ceaeafa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-blmnv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-02-23T00:07:56Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.865082 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26428" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29209462-90c5-4aa1-9943-d8b15ac1b5a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13db4202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13db4202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26428\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:56Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.883175 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:56Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.895612 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:56Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.909742 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4gvxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b63c18f-b6b2-4d97-b542-7800b475bd4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33055c975ea730c1618798b3885ea706d784d23c63c80c8fbb66ffbc27318c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4pzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4gvxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:56Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.918033 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:56 crc 
kubenswrapper[4735]: I0223 00:07:56.918073 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.918085 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.918103 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.918116 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:56Z","lastTransitionTime":"2026-02-23T00:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.931537 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66853c8a-9391-4291-b5f1-c72cb5fe23e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fbdabf48ee491025ac86e29f91e3da94414ebbe726d99e706b7ed31fcce58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-59rkm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:56Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.946298 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40cc14f8-4b0d-4268-82bc-e9c2d8073cf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T00:07:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0223 00:07:35.797171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 00:07:35.799032 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1411013438/tls.crt::/tmp/serving-cert-1411013438/tls.key\\\\\\\"\\\\nI0223 00:07:41.683988 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0223 00:07:41.688343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0223 00:07:41.688375 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0223 00:07:41.688410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0223 00:07:41.688422 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0223 00:07:41.698776 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0223 00:07:41.698820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698840 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0223 00:07:41.698875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0223 00:07:41.698881 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0223 00:07:41.698924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0223 00:07:41.699038 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0223 00:07:41.701822 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:56Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.976666 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20a3f0f8-dc0e-4c2c-95ac-2ef9874425f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731
4731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\
"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:56Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:56 crc kubenswrapper[4735]: I0223 00:07:56.998312 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jfk7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c2131b2-ccbc-49ff-a0bc-fd6639563dd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0d7eacf1e5180d67f73b2e794d7a0a995619d05d1019a3e152e3fa4b7c4c369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5g8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jfk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:56Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.017569 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a35acbfb5836e5ddac27f7e140e64897f9f2ad3615e4db66b091b79744c699d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c4f2f9d3265d84be4a241b5a2a0193896ad4e9daeec1aaca6fe4fef0be21c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:57Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.020797 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.020895 4735 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.020923 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.020951 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.020969 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:57Z","lastTransitionTime":"2026-02-23T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.031840 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zhj4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dd002d-a62b-432e-bc82-21ffdfc5e0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3765b3d0261e124aa7d128e9bdf43a6c76939b5d63f3960ab38a129bf3476c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8649q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zhj4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:57Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.050001 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:57Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.070993 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://340ce9e1f14e0dc6339a25b1bd23f58d9790656cfea764bac921d60603920cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:57Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.090256 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04edd9324344faf24b9181705097f238050d69e14bf46da2f76f3d114552c942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:57Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.106005 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cba474f-2d55-4a07-969f-25e2817a06d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3ae04d2c5dac3e689b744efd8990e7210f627cf79b77aab8cdb477e22eedeef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://908fd79ca8e314a794a24b502c23d0b5abbc41b2bf079086e8fe6fc3ceaeafa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-blmnv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-02-23T00:07:57Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.123719 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.123781 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.123801 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.123826 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.123845 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:57Z","lastTransitionTime":"2026-02-23T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.128386 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26428" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29209462-90c5-4aa1-9943-d8b15ac1b5a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13db4202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13db4202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26428\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:57Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.149166 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1153ca78-e3e2-458f-8a31-e16738c1677c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43f8fa8d2d76248580ea4800c8e0e97c060feb42b7c78fa9be3f5adbeaaf9437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a47ab0d8562bafd3080c29289e6cd496a9f76188960a3b6036cc044609876406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd23f1d94c143d946e301f9af58c6fe189e1e95ef05c2648ad167acab65726df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb8e52070bd8c7a9784d6f81037eee627297839cf49ec88b861a6d8f3a7a16e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:57Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.220333 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 13:03:40.365050456 +0000 UTC Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.226805 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.227511 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.227642 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.227735 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.227827 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:57Z","lastTransitionTime":"2026-02-23T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.271749 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:07:57 crc kubenswrapper[4735]: E0223 00:07:57.272344 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.331541 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.331604 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.331623 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.331647 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.331664 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:57Z","lastTransitionTime":"2026-02-23T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.374183 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.374341 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:07:57 crc kubenswrapper[4735]: E0223 00:07:57.374442 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:08:13.374407832 +0000 UTC m=+51.837953843 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:07:57 crc kubenswrapper[4735]: E0223 00:07:57.374504 4735 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.374509 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:07:57 crc kubenswrapper[4735]: E0223 00:07:57.374568 4735 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 00:07:57 crc kubenswrapper[4735]: E0223 00:07:57.374620 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 00:08:13.374605216 +0000 UTC m=+51.838151217 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 00:07:57 crc kubenswrapper[4735]: E0223 00:07:57.374654 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 00:08:13.374630667 +0000 UTC m=+51.838176678 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.435135 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.435192 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.435209 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.435231 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.435249 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:57Z","lastTransitionTime":"2026-02-23T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.475745 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.475906 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:07:57 crc kubenswrapper[4735]: E0223 00:07:57.475975 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 00:07:57 crc kubenswrapper[4735]: E0223 00:07:57.476021 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 00:07:57 crc kubenswrapper[4735]: E0223 00:07:57.476047 4735 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 00:07:57 crc kubenswrapper[4735]: E0223 00:07:57.476053 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 00:07:57 crc kubenswrapper[4735]: E0223 00:07:57.476077 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 00:07:57 crc kubenswrapper[4735]: E0223 00:07:57.476157 4735 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 00:07:57 crc kubenswrapper[4735]: E0223 00:07:57.476160 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-23 00:08:13.476118231 +0000 UTC m=+51.939664242 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 00:07:57 crc kubenswrapper[4735]: E0223 00:07:57.476227 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-23 00:08:13.476210034 +0000 UTC m=+51.939756035 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.538910 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.538986 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.539005 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.539031 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.539048 4735 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:57Z","lastTransitionTime":"2026-02-23T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.578078 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-26428" event={"ID":"29209462-90c5-4aa1-9943-d8b15ac1b5a3","Type":"ContainerStarted","Data":"23e62d470b47146565ead752e6fb5578b3951831fd10126ec5ba4752e6a42e66"} Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.602553 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1153ca78-e3e2-458f-8a31-e16738c1677c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43f8fa8d2d76248580ea4800c8e0e97c060feb42b7c78fa9be3f5adbeaaf9437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a47ab0d8562bafd3080c29289e6cd496a9f76188960a3b6036cc044609876406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd23f1d94c143d946e301f9af58c6fe189e1e95ef05c2648ad167acab65726df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\
\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb8e52070bd8c7a9784d6f81037eee627297839cf49ec88b861a6d8f3a7a16e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:57Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.623178 4735 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://340ce9e1f14e0dc6339a25b1bd23f58d9790656cfea764bac921d60603920cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:57Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.642239 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.642313 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.642339 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.642373 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.642394 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:57Z","lastTransitionTime":"2026-02-23T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.642580 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04edd9324344faf24b9181705097f238050d69e14bf46da2f76f3d114552c942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:57Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.660380 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cba474f-2d55-4a07-969f-25e2817a06d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3ae04d2c5dac3e689b744efd8990e7210f627cf79b77aab8cdb477e22eedeef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://908fd79ca8e314a794a24b502c23d0b5abbc41b2bf079086e8fe6fc3ceaeafa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-blmnv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-23T00:07:57Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.684663 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26428" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29209462-90c5-4aa1-9943-d8b15ac1b5a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23e62d470b47146565ead752e6fb5578b3951831fd10126ec5ba4752e6a42e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13db4202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13db4202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":
\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26428\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:57Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.706216 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40cc14f8-4b0d-4268-82bc-e9c2d8073cf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T00:07:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0223 00:07:35.797171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 00:07:35.799032 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1411013438/tls.crt::/tmp/serving-cert-1411013438/tls.key\\\\\\\"\\\\nI0223 00:07:41.683988 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0223 00:07:41.688343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0223 00:07:41.688375 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0223 00:07:41.688410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0223 00:07:41.688422 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0223 00:07:41.698776 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0223 00:07:41.698820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698840 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0223 00:07:41.698875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0223 00:07:41.698881 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0223 00:07:41.698924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0223 00:07:41.699038 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0223 00:07:41.701822 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:57Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.725176 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:57Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.745784 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.745880 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.745819 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:57Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.745906 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.745931 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.745949 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:57Z","lastTransitionTime":"2026-02-23T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.766935 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4gvxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b63c18f-b6b2-4d97-b542-7800b475bd4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33055c975ea730c1618798b3885ea706d784d23c63c80c8fbb66ffbc27318c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4pzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4gvxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:57Z 
is after 2025-08-24T17:21:41Z" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.798204 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66853c8a-9391-4291-b5f1-c72cb5fe23e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fbdabf48ee491025ac86e29f91e3da94414ebbe726d99e706b7ed31fcce58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-59rkm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:57Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.835220 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20a3f0f8-dc0e-4c2c-95ac-2ef9874425f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:57Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.849434 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.849484 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.849502 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.849530 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.849553 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:57Z","lastTransitionTime":"2026-02-23T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.852316 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jfk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c2131b2-ccbc-49ff-a0bc-fd6639563dd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0d7eacf1e5180d67f73b2e794d7a0a995619d05d1019a3e152e3fa4b7c4c369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5g8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jfk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:57Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.873280 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:57Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.892740 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a35acbfb5836e5ddac27f7e140e64897f9f2ad3615e4db66b091b79744c699d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c4f2f9d3265d84be4a241b5a2a0193896ad4e9daeec1aaca6fe4fef0be21c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:57Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.910093 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zhj4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dd002d-a62b-432e-bc82-21ffdfc5e0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3765b3d0261e124aa7d128e9bdf43a6c76939b5d63f3960ab38a129bf3476c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8649q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zhj4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:57Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.952290 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.952333 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.952345 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.952365 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:57 crc kubenswrapper[4735]: I0223 00:07:57.952381 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:57Z","lastTransitionTime":"2026-02-23T00:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.055525 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.055588 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.055604 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.055631 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.055647 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:58Z","lastTransitionTime":"2026-02-23T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.158802 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.158877 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.158896 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.158915 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.158929 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:58Z","lastTransitionTime":"2026-02-23T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.220678 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 09:09:43.470843793 +0000 UTC Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.262347 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.262420 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.262444 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.262472 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.262493 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:58Z","lastTransitionTime":"2026-02-23T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.272010 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.272047 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:07:58 crc kubenswrapper[4735]: E0223 00:07:58.272267 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 00:07:58 crc kubenswrapper[4735]: E0223 00:07:58.272376 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.384302 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.384348 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.384359 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.384376 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.384388 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:58Z","lastTransitionTime":"2026-02-23T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.441012 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.441054 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.441066 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.441084 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.441097 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:58Z","lastTransitionTime":"2026-02-23T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:58 crc kubenswrapper[4735]: E0223 00:07:58.456779 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:07:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:07:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:07:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:07:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc670e79-a4c0-4f94-a41e-9a217a93a98f\\\",\\\"systemUUID\\\":\\\"34aea2d1-4777-4ec2-a0dd-7ee942962cf5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:58Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.461947 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.461989 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.462000 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.462017 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.462029 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:58Z","lastTransitionTime":"2026-02-23T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:58 crc kubenswrapper[4735]: E0223 00:07:58.476295 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:07:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:07:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:07:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:07:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc670e79-a4c0-4f94-a41e-9a217a93a98f\\\",\\\"systemUUID\\\":\\\"34aea2d1-4777-4ec2-a0dd-7ee942962cf5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:58Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.480301 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.480346 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.480360 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.480377 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.480391 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:58Z","lastTransitionTime":"2026-02-23T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:58 crc kubenswrapper[4735]: E0223 00:07:58.490990 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:07:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:07:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:07:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:07:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc670e79-a4c0-4f94-a41e-9a217a93a98f\\\",\\\"systemUUID\\\":\\\"34aea2d1-4777-4ec2-a0dd-7ee942962cf5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:58Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.494271 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.494311 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.494323 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.494339 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.494350 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:58Z","lastTransitionTime":"2026-02-23T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:58 crc kubenswrapper[4735]: E0223 00:07:58.507149 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:07:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:07:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:07:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:07:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc670e79-a4c0-4f94-a41e-9a217a93a98f\\\",\\\"systemUUID\\\":\\\"34aea2d1-4777-4ec2-a0dd-7ee942962cf5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:58Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.509822 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.509913 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.509933 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.509956 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.509974 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:58Z","lastTransitionTime":"2026-02-23T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:58 crc kubenswrapper[4735]: E0223 00:07:58.522911 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:07:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:07:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:07:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:07:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc670e79-a4c0-4f94-a41e-9a217a93a98f\\\",\\\"systemUUID\\\":\\\"34aea2d1-4777-4ec2-a0dd-7ee942962cf5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:58Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:58 crc kubenswrapper[4735]: E0223 00:07:58.523079 4735 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.524481 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.524510 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.524523 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.524539 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.524551 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:58Z","lastTransitionTime":"2026-02-23T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.627634 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.627678 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.627691 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.627712 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.627724 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:58Z","lastTransitionTime":"2026-02-23T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.731301 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.731365 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.731383 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.731409 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.731427 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:58Z","lastTransitionTime":"2026-02-23T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.836245 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.836298 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.836315 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.836338 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.836354 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:58Z","lastTransitionTime":"2026-02-23T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.939595 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.939988 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.940009 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.940034 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:58 crc kubenswrapper[4735]: I0223 00:07:58.940053 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:58Z","lastTransitionTime":"2026-02-23T00:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.046835 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.046928 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.046948 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.046972 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.046990 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:59Z","lastTransitionTime":"2026-02-23T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.149791 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.149853 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.149897 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.149920 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.149936 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:59Z","lastTransitionTime":"2026-02-23T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.221204 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 22:44:28.51286469 +0000 UTC Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.235739 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.252888 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.252998 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.253018 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.253041 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.253057 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:59Z","lastTransitionTime":"2026-02-23T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.254137 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jfk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c2131b2-ccbc-49ff-a0bc-fd6639563dd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0d7eacf1e5180d67f73b2e794d7a0a995619d05d1019a3e152e3fa4b7c4c369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5g8g\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jfk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:59Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.271828 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:07:59 crc kubenswrapper[4735]: E0223 00:07:59.272071 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.287577 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20a3f0f8-dc0e-4c2c-95ac-2ef9874425f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"
/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036
cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117
b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\
\"2026-02-23T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:59Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.306944 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:59Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.329516 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a35acbfb5836e5ddac27f7e140e64897f9f2ad3615e4db66b091b79744c699d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c4f2f9d3265d84be4a241b5a2a0193896ad4e9daeec1aaca6fe4fef0be21c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:59Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.346561 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zhj4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dd002d-a62b-432e-bc82-21ffdfc5e0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3765b3d0261e124aa7d128e9bdf43a6c76939b5d63f3960ab38a129bf3476c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8649q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zhj4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:59Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.356395 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.356456 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.356473 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.356498 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.356516 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:59Z","lastTransitionTime":"2026-02-23T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.364224 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cba474f-2d55-4a07-969f-25e2817a06d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3ae04d2c5dac3e689b744efd8990e7210f627cf79b77aab8cdb477e22eedeef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://908fd79ca8e314a794a24b502c23d0b5abbc41b2bf079086e8fe6fc3ceaeafa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-blmnv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:59Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.387221 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26428" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29209462-90c5-4aa1-9943-d8b15ac1b5a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23e62d470b47146565ead752e6fb5578b3951831fd10126ec5ba4752e6a42e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13db4
202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13db4202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26428\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:59Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.405543 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1153ca78-e3e2-458f-8a31-e16738c1677c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43f8fa8d2d76248580ea4800c8e0e97c060feb42b7c78fa9be3f5adbeaaf9437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a47ab0d8562bafd3080c29289e6cd496a9f76188960a3b6036cc044609876406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd23f1d94c143d946e301f9af58c6fe189e1e95ef05c2648ad167acab65726df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb8e52070bd8c7a9784d6f81037eee627297839cf49ec88b861a6d8f3a7a16e\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:59Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.426288 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://340ce9e1f14e0dc6339a25b1bd23f58d9790656cfea764bac921d60603920cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:59Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.444549 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04edd9324344faf24b9181705097f238050d69e14bf46da2f76f3d114552c942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:59Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.459426 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.459605 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.459629 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.459651 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.459667 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:59Z","lastTransitionTime":"2026-02-23T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.467722 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4gvxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b63c18f-b6b2-4d97-b542-7800b475bd4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33055c975ea730c1618798b3885ea706d784d23c63c80c8fbb66ffbc27318c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4pzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4gvxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:59Z 
is after 2025-08-24T17:21:41Z" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.499371 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66853c8a-9391-4291-b5f1-c72cb5fe23e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fbdabf48ee491025ac86e29f91e3da94414ebbe726d99e706b7ed31fcce58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-59rkm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:59Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.521625 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40cc14f8-4b0d-4268-82bc-e9c2d8073cf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"res
ource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T00:07:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0223 00:07:35.797171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 00:07:35.799032 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411013438/tls.crt::/tmp/serving-cert-1411013438/tls.key\\\\\\\"\\\\nI0223 00:07:41.683988 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0223 00:07:41.688343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0223 00:07:41.688375 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0223 00:07:41.688410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0223 00:07:41.688422 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0223 00:07:41.698776 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0223 00:07:41.698820 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698840 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0223 00:07:41.698875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0223 00:07:41.698881 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0223 00:07:41.698924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0223 00:07:41.699038 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0223 00:07:41.701822 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T0
0:07:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:59Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.541137 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:59Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.561155 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:59Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.563405 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.563453 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.563470 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 
00:07:59.563492 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.563509 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:59Z","lastTransitionTime":"2026-02-23T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.588306 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-59rkm_66853c8a-9391-4291-b5f1-c72cb5fe23e8/ovnkube-controller/0.log" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.592669 4735 generic.go:334] "Generic (PLEG): container finished" podID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerID="f1fbdabf48ee491025ac86e29f91e3da94414ebbe726d99e706b7ed31fcce58c" exitCode=1 Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.592712 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" event={"ID":"66853c8a-9391-4291-b5f1-c72cb5fe23e8","Type":"ContainerDied","Data":"f1fbdabf48ee491025ac86e29f91e3da94414ebbe726d99e706b7ed31fcce58c"} Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.593582 4735 scope.go:117] "RemoveContainer" containerID="f1fbdabf48ee491025ac86e29f91e3da94414ebbe726d99e706b7ed31fcce58c" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.613966 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:59Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.636720 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a35acbfb5836e5ddac27f7e140e64897f9f2ad3615e4db66b091b79744c699d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c4f2f9d3265d84be4a241b5a2a0193896ad4e9daeec1aaca6fe4fef0be21c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:59Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.653679 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zhj4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dd002d-a62b-432e-bc82-21ffdfc5e0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3765b3d0261e124aa7d128e9bdf43a6c76939b5d63f3960ab38a129bf3476c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8649q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zhj4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:59Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.666597 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.666635 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.666647 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.666691 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.666704 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:59Z","lastTransitionTime":"2026-02-23T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.677745 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26428" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29209462-90c5-4aa1-9943-d8b15ac1b5a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23e62d470b47146565ead752e6fb5578b3951831fd10126ec5ba4752e6a42e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13db4202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13db4202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26428\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:59Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.698453 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1153ca78-e3e2-458f-8a31-e16738c1677c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43f8fa8d2d76248580ea4800c8e0e97c060feb42b7c78fa9be3f5adbeaaf9437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c02
6b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a47ab0d8562bafd3080c29289e6cd496a9f76188960a3b6036cc044609876406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd23f1d94c143d946e301f9af58c6fe189e1e95ef05c2648ad167acab65726df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb8e52070bd8c7a9784d6f81037eee627297839cf49ec88b861a6d8f3a7a16e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:59Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.719990 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://340ce9e1f14e0dc6339a25b1bd23f58d9790656cfea764bac921d60603920cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:59Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.732744 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04edd9324344faf24b9181705097f238050d69e14bf46da2f76f3d114552c942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:59Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.751077 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cba474f-2d55-4a07-969f-25e2817a06d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3ae04d2c5dac3e689b744efd8990e7210f627cf79b77aab8cdb477e22eedeef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://908fd79ca8e314a794a24b502c23d0b5abbc41b2bf079086e8fe6fc3ceaeafa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-blmnv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-02-23T00:07:59Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.769463 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.769518 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.769535 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.769559 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.769579 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:59Z","lastTransitionTime":"2026-02-23T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.779128 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66853c8a-9391-4291-b5f1-c72cb5fe23e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1fbdabf48ee491025ac86e29f91e3da94414ebbe726d99e706b7ed31fcce58c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1fbdabf48ee491025ac86e29f91e3da94414ebbe726d99e706b7ed31fcce58c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T00:07:59Z\\\",\\\"message\\\":\\\"ctor *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 00:07:59.067783 6045 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 
00:07:59.067959 6045 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0223 00:07:59.068031 6045 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0223 00:07:59.068187 6045 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 00:07:59.068954 6045 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0223 00:07:59.069006 6045 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0223 00:07:59.069015 6045 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0223 00:07:59.069038 6045 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0223 00:07:59.069037 6045 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0223 00:07:59.069065 6045 factory.go:656] Stopping watch factory\\\\nI0223 00:07:59.069090 6045 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-59rkm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:59Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.797670 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40cc14f8-4b0d-4268-82bc-e9c2d8073cf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T00:07:41Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0223 00:07:35.797171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 00:07:35.799032 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411013438/tls.crt::/tmp/serving-cert-1411013438/tls.key\\\\\\\"\\\\nI0223 00:07:41.683988 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0223 00:07:41.688343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0223 00:07:41.688375 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0223 00:07:41.688410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0223 00:07:41.688422 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0223 00:07:41.698776 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0223 00:07:41.698820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698840 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0223 00:07:41.698875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0223 00:07:41.698881 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0223 00:07:41.698924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0223 00:07:41.699038 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0223 00:07:41.701822 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51
d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:59Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.816609 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:59Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.831633 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:59Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.848449 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4gvxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b63c18f-b6b2-4d97-b542-7800b475bd4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33055c975ea730c1618798b3885ea706d784d23c63c80c8fbb66ffbc27318c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4pzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4gvxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:59Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.872308 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:59 crc 
kubenswrapper[4735]: I0223 00:07:59.872349 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.872361 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.872377 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.872389 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:59Z","lastTransitionTime":"2026-02-23T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.875221 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20a3f0f8-dc0e-4c2c-95ac-2ef9874425f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:59Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.885404 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jfk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c2131b2-ccbc-49ff-a0bc-fd6639563dd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0d7eacf1e5180d67f73b2e794d7a0a995619d05d1019a3e152e3fa4b7c4c369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5g8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jfk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:07:59Z is after 2025-08-24T17:21:41Z" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.975290 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.975331 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.975340 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:07:59 crc kubenswrapper[4735]: I0223 00:07:59.975353 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:07:59 crc 
kubenswrapper[4735]: I0223 00:07:59.975362 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:07:59Z","lastTransitionTime":"2026-02-23T00:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.079225 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.079289 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.079309 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.079337 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.079354 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:00Z","lastTransitionTime":"2026-02-23T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.182356 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.182402 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.182414 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.182431 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.182445 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:00Z","lastTransitionTime":"2026-02-23T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.222090 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 04:19:51.924238835 +0000 UTC Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.271965 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.272102 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:08:00 crc kubenswrapper[4735]: E0223 00:08:00.272170 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 00:08:00 crc kubenswrapper[4735]: E0223 00:08:00.272367 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.286387 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.286445 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.286460 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.286481 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.286496 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:00Z","lastTransitionTime":"2026-02-23T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.389342 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.389404 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.389423 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.389445 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.389462 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:00Z","lastTransitionTime":"2026-02-23T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.492195 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.492251 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.492271 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.492294 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.492314 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:00Z","lastTransitionTime":"2026-02-23T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.596091 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.596163 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.596182 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.596206 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.596224 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:00Z","lastTransitionTime":"2026-02-23T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.602039 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-59rkm_66853c8a-9391-4291-b5f1-c72cb5fe23e8/ovnkube-controller/0.log" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.606444 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" event={"ID":"66853c8a-9391-4291-b5f1-c72cb5fe23e8","Type":"ContainerStarted","Data":"feb388ead34e3f72990c3f6cb25e388f77d137e16df82dc8ae239f146b7154f1"} Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.607071 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.635711 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20a3f0f8-dc0e-4c2c-95ac-2ef9874425f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019b
ee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b2
3b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.647516 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jfk7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c2131b2-ccbc-49ff-a0bc-fd6639563dd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0d7eacf1e5180d67f73b2e794d7a0a995619d05d1019a3e152e3fa4b7c4c369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5g8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jfk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.663243 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.681671 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a35acbfb5836e5ddac27f7e140e64897f9f2ad3615e4db66b091b79744c699d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c4f2f9d3265d84be4a241b5a2a0193896ad4e9daeec1aaca6fe4fef0be21c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.697458 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zhj4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dd002d-a62b-432e-bc82-21ffdfc5e0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3765b3d0261e124aa7d128e9bdf43a6c76939b5d63f3960ab38a129bf3476c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8649q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zhj4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.699557 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.699621 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.699638 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.699659 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.699674 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:00Z","lastTransitionTime":"2026-02-23T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.713943 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1153ca78-e3e2-458f-8a31-e16738c1677c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43f8fa8d2d76248580ea4800c8e0e97c060feb42b7c78fa9be3f5adbeaaf9437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a47ab0d8562
bafd3080c29289e6cd496a9f76188960a3b6036cc044609876406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd23f1d94c143d946e301f9af58c6fe189e1e95ef05c2648ad167acab65726df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb8e52070bd8c7a9784d6f81037eee627297839cf49ec88b861a6d8f3a7a16e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.727946 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://340ce9e1f14e0dc6339a25b1bd23f58d9790656cfea764bac921d60603920cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.743509 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04edd9324344faf24b9181705097f238050d69e14bf46da2f76f3d114552c942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.755721 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cba474f-2d55-4a07-969f-25e2817a06d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3ae04d2c5dac3e689b744efd8990e7210f627cf79b77aab8cdb477e22eedeef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://908fd79ca8e314a794a24b502c23d0b5abbc41b2bf079086e8fe6fc3ceaeafa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-blmnv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-02-23T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.772892 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26428" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29209462-90c5-4aa1-9943-d8b15ac1b5a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23e62d470b47146565ead752e6fb5578b3951831fd10126ec5ba4752e6a42e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\"
:false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-relea
se\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13db4202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13db4202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-binco
py\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"
Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26428\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.792530 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40cc14f8-4b0d-4268-82bc-e9c2d8073cf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d
34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernete
s/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T00:07:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0223 00:07:35.797171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 00:07:35.799032 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411013438/tls.crt::/tmp/serving-cert-1411013438/tls.key\\\\\\\"\\\\nI0223 00:07:41.683988 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0223 00:07:41.688343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0223 00:07:41.688375 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0223 00:07:41.688410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0223 00:07:41.688422 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0223 00:07:41.698776 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0223 
00:07:41.698820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698840 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0223 00:07:41.698875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0223 00:07:41.698881 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0223 00:07:41.698924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0223 00:07:41.699038 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0223 00:07:41.701822 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.803081 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.803142 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:00 crc 
kubenswrapper[4735]: I0223 00:08:00.803163 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.803196 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.803215 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:00Z","lastTransitionTime":"2026-02-23T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.811971 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.828773 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.846542 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4gvxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b63c18f-b6b2-4d97-b542-7800b475bd4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33055c975ea730c1618798b3885ea706d784d23c63c80c8fbb66ffbc27318c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4pzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4gvxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.893478 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66853c8a-9391-4291-b5f1-c72cb5fe23e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb388ead34e3f72990c3f6cb25e388f77d137e16df82dc8ae239f146b7154f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1fbdabf48ee491025ac86e29f91e3da94414ebbe726d99e706b7ed31fcce58c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T00:07:59Z\\\",\\\"message\\\":\\\"ctor *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 00:07:59.067783 6045 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 00:07:59.067959 6045 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0223 00:07:59.068031 6045 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0223 00:07:59.068187 6045 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 00:07:59.068954 6045 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0223 00:07:59.069006 6045 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0223 00:07:59.069015 6045 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0223 00:07:59.069038 6045 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0223 00:07:59.069037 6045 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0223 00:07:59.069065 6045 factory.go:656] Stopping watch factory\\\\nI0223 00:07:59.069090 6045 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-59rkm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:00Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.906067 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.906127 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.906149 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.906176 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:00 crc kubenswrapper[4735]: I0223 00:08:00.906196 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:00Z","lastTransitionTime":"2026-02-23T00:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.009243 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.009313 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.009336 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.009369 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.009397 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:01Z","lastTransitionTime":"2026-02-23T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.113183 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.113250 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.113303 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.113330 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.113347 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:01Z","lastTransitionTime":"2026-02-23T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.216133 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.216211 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.216229 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.216257 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.216275 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:01Z","lastTransitionTime":"2026-02-23T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.222589 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 11:40:43.161675198 +0000 UTC Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.272123 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:08:01 crc kubenswrapper[4735]: E0223 00:08:01.272339 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.319957 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.320010 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.320027 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.320050 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.320068 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:01Z","lastTransitionTime":"2026-02-23T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.423135 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.423975 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.424007 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.424071 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.424097 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:01Z","lastTransitionTime":"2026-02-23T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.527574 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.527918 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.528067 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.528239 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.528367 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:01Z","lastTransitionTime":"2026-02-23T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.613098 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-59rkm_66853c8a-9391-4291-b5f1-c72cb5fe23e8/ovnkube-controller/1.log" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.614219 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-59rkm_66853c8a-9391-4291-b5f1-c72cb5fe23e8/ovnkube-controller/0.log" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.618334 4735 generic.go:334] "Generic (PLEG): container finished" podID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerID="feb388ead34e3f72990c3f6cb25e388f77d137e16df82dc8ae239f146b7154f1" exitCode=1 Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.618469 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" event={"ID":"66853c8a-9391-4291-b5f1-c72cb5fe23e8","Type":"ContainerDied","Data":"feb388ead34e3f72990c3f6cb25e388f77d137e16df82dc8ae239f146b7154f1"} Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.618553 4735 scope.go:117] "RemoveContainer" containerID="f1fbdabf48ee491025ac86e29f91e3da94414ebbe726d99e706b7ed31fcce58c" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.619538 4735 scope.go:117] "RemoveContainer" containerID="feb388ead34e3f72990c3f6cb25e388f77d137e16df82dc8ae239f146b7154f1" Feb 23 00:08:01 crc kubenswrapper[4735]: E0223 00:08:01.619812 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-59rkm_openshift-ovn-kubernetes(66853c8a-9391-4291-b5f1-c72cb5fe23e8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.631822 4735 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.632083 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.632312 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.632610 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.633217 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:01Z","lastTransitionTime":"2026-02-23T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.639606 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zhj4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dd002d-a62b-432e-bc82-21ffdfc5e0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3765b3d0261e124aa7d128e9bdf43a6c76939b5d63f3960ab38a129bf3476c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8649q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zhj4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.661914 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.682431 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a35acbfb5836e5ddac27f7e140e64897f9f2ad3615e4db66b091b79744c699d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c4f2f9d3265d84be4a241b5a2a0193896ad4e9daeec1aaca6fe4fef0be21c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.700654 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04edd9324344faf24b9181705097f238050d69e14bf46da2f76f3d114552c942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-23T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.717798 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cba474f-2d55-4a07-969f-25e2817a06d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3ae04d2c5dac3e689b744efd8990e7210f627cf79b77aab8cdb477e22eedeef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://908fd79ca8e314a794a24b502c23d0b5abbc41b2bf079086e8fe6fc3ceaeafa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-blmnv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.743097 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26428" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29209462-90c5-4aa1-9943-d8b15ac1b5a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23e62d470b47146565ead752e6fb5578b3951831fd10126ec5ba4752e6a42e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13db4
202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13db4202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26428\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.745468 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.745515 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.745532 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.745557 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.745575 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:01Z","lastTransitionTime":"2026-02-23T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.765939 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1153ca78-e3e2-458f-8a31-e16738c1677c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43f8fa8d2d76248580ea4800c8e0e97c060feb42b7c78fa9be3f5adbeaaf9437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a47ab0d8562
bafd3080c29289e6cd496a9f76188960a3b6036cc044609876406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd23f1d94c143d946e301f9af58c6fe189e1e95ef05c2648ad167acab65726df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb8e52070bd8c7a9784d6f81037eee627297839cf49ec88b861a6d8f3a7a16e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.787007 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://340ce9e1f14e0dc6339a25b1bd23f58d9790656cfea764bac921d60603920cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.808122 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.828930 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4gvxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b63c18f-b6b2-4d97-b542-7800b475bd4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33055c975ea730c1618798b3885ea706d784d23c63c80c8fbb66ffbc27318c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4pzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4gvxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.848471 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:01 crc 
kubenswrapper[4735]: I0223 00:08:01.848688 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.848819 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.849000 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.849124 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:01Z","lastTransitionTime":"2026-02-23T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.857593 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66853c8a-9391-4291-b5f1-c72cb5fe23e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb388ead34e3f72990c3f6cb25e388f77d137e16df82dc8ae239f146b7154f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1fbdabf48ee491025ac86e29f91e3da94414ebbe726d99e706b7ed31fcce58c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T00:07:59Z\\\",\\\"message\\\":\\\"ctor *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 
00:07:59.067783 6045 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 00:07:59.067959 6045 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0223 00:07:59.068031 6045 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0223 00:07:59.068187 6045 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 00:07:59.068954 6045 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0223 00:07:59.069006 6045 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0223 00:07:59.069015 6045 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0223 00:07:59.069038 6045 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0223 00:07:59.069037 6045 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0223 00:07:59.069065 6045 factory.go:656] Stopping watch factory\\\\nI0223 00:07:59.069090 6045 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feb388ead34e3f72990c3f6cb25e388f77d137e16df82dc8ae239f146b7154f1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T00:08:00Z\\\",\\\"message\\\":\\\" 6 for removal\\\\nI0223 00:08:00.703109 6175 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0223 00:08:00.703066 6175 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 00:08:00.703206 6175 handler.go:208] Removed 
*v1.Pod event handler 3\\\\nI0223 00:08:00.703283 6175 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0223 00:08:00.703382 6175 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0223 00:08:00.703454 6175 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0223 00:08:00.703472 6175 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0223 00:08:00.703502 6175 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0223 00:08:00.703522 6175 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0223 00:08:00.703503 6175 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0223 00:08:00.703546 6175 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0223 00:08:00.703552 6175 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0223 00:08:00.703566 6175 factory.go:656] Stopping watch factory\\\\nI0223 00:08:00.703580 6175 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0223 00:08:00.703589 6175 handler.go:208] Removed *v1.EgressIP 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-59rkm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.880089 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40cc14f8-4b0d-4268-82bc-e9c2d8073cf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T00:07:41Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0223 00:07:35.797171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 00:07:35.799032 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411013438/tls.crt::/tmp/serving-cert-1411013438/tls.key\\\\\\\"\\\\nI0223 00:07:41.683988 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0223 00:07:41.688343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0223 00:07:41.688375 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0223 00:07:41.688410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0223 00:07:41.688422 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0223 00:07:41.698776 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0223 00:07:41.698820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698840 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0223 00:07:41.698875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0223 00:07:41.698881 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0223 00:07:41.698924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0223 00:07:41.699038 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0223 00:07:41.701822 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51
d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.901105 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.917651 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jfk7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c2131b2-ccbc-49ff-a0bc-fd6639563dd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0d7eacf1e5180d67f73b2e794d7a0a995619d05d1019a3e152e3fa4b7c4c369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5g8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jfk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.949201 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20a3f0f8-dc0e-4c2c-95ac-2ef9874425f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:01Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.952501 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.952548 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.952561 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 00:08:01.952579 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:01 crc kubenswrapper[4735]: I0223 
00:08:01.952595 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:01Z","lastTransitionTime":"2026-02-23T00:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.055838 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.055952 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.055977 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.056012 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.056030 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:02Z","lastTransitionTime":"2026-02-23T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.159945 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.160083 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.160103 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.160168 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.160190 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:02Z","lastTransitionTime":"2026-02-23T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.223754 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 14:09:31.558169772 +0000 UTC Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.263376 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.263427 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.263436 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.263455 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.263468 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:02Z","lastTransitionTime":"2026-02-23T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.271475 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:08:02 crc kubenswrapper[4735]: E0223 00:08:02.271665 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.271827 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:08:02 crc kubenswrapper[4735]: E0223 00:08:02.272269 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.299586 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1153ca78-e3e2-458f-8a31-e16738c1677c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43f8fa8d2d76248580ea4800c8e0e97c060feb42b7c78fa9be3f5adbeaaf9437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a47ab0d8562bafd3080c29289e6cd496a9f76188960a3b6036cc044609876406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd23f1d94c143d946e301f9af58c6fe189e1e95ef05c2648ad167acab65726df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb8e52070bd8c7a9784d6f81037eee627297839cf49ec88b861a6d8f3a7a16e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:02Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.335579 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://340ce9e1f14e0dc6339a25b1bd23f58d9790656cfea764bac921d60603920cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:02Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.360635 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04edd9324344faf24b9181705097f238050d69e14bf46da2f76f3d114552c942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:02Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.367437 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.367474 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.367487 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.367509 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.367520 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:02Z","lastTransitionTime":"2026-02-23T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.381964 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cba474f-2d55-4a07-969f-25e2817a06d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3ae04d2c5dac3e689b744efd8990e7210f627cf79b77aab8cdb477e22eedeef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://908fd79ca8e314a794a24b502c23d0b5abbc41b2bf079086e8fe6fc3ceaeafa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-blmnv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:02Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.403563 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26428" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29209462-90c5-4aa1-9943-d8b15ac1b5a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23e62d470b47146565ead752e6fb5578b3951831fd10126ec5ba4752e6a42e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13db4
202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13db4202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26428\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:02Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.421391 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40cc14f8-4b0d-4268-82bc-e9c2d8073cf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T00:07:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0223 00:07:35.797171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 00:07:35.799032 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411013438/tls.crt::/tmp/serving-cert-1411013438/tls.key\\\\\\\"\\\\nI0223 00:07:41.683988 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0223 00:07:41.688343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0223 00:07:41.688375 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0223 00:07:41.688410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0223 00:07:41.688422 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0223 00:07:41.698776 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0223 00:07:41.698820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698840 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0223 00:07:41.698875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0223 00:07:41.698881 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0223 00:07:41.698924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0223 00:07:41.699038 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0223 00:07:41.701822 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:02Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.435513 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:02Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.451056 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:02Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.467712 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4gvxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b63c18f-b6b2-4d97-b542-7800b475bd4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33055c975ea730c1618798b3885ea706d784d23c63c80c8fbb66ffbc27318c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4pzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4gvxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:02Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.469745 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:02 crc 
kubenswrapper[4735]: I0223 00:08:02.469804 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.469828 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.469899 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.469925 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:02Z","lastTransitionTime":"2026-02-23T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.493758 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66853c8a-9391-4291-b5f1-c72cb5fe23e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb388ead34e3f72990c3f6cb25e388f77d137e16df82dc8ae239f146b7154f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1fbdabf48ee491025ac86e29f91e3da94414ebbe726d99e706b7ed31fcce58c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T00:07:59Z\\\",\\\"message\\\":\\\"ctor *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 
00:07:59.067783 6045 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 00:07:59.067959 6045 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0223 00:07:59.068031 6045 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0223 00:07:59.068187 6045 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 00:07:59.068954 6045 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0223 00:07:59.069006 6045 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0223 00:07:59.069015 6045 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0223 00:07:59.069038 6045 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0223 00:07:59.069037 6045 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0223 00:07:59.069065 6045 factory.go:656] Stopping watch factory\\\\nI0223 00:07:59.069090 6045 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feb388ead34e3f72990c3f6cb25e388f77d137e16df82dc8ae239f146b7154f1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T00:08:00Z\\\",\\\"message\\\":\\\" 6 for removal\\\\nI0223 00:08:00.703109 6175 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0223 00:08:00.703066 6175 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 00:08:00.703206 6175 handler.go:208] Removed 
*v1.Pod event handler 3\\\\nI0223 00:08:00.703283 6175 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0223 00:08:00.703382 6175 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0223 00:08:00.703454 6175 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0223 00:08:00.703472 6175 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0223 00:08:00.703502 6175 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0223 00:08:00.703522 6175 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0223 00:08:00.703503 6175 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0223 00:08:00.703546 6175 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0223 00:08:00.703552 6175 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0223 00:08:00.703566 6175 factory.go:656] Stopping watch factory\\\\nI0223 00:08:00.703580 6175 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0223 00:08:00.703589 6175 handler.go:208] Removed *v1.EgressIP 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-59rkm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:02Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.525251 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20a3f0f8-dc0e-4c2c-95ac-2ef9874425f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:02Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.539133 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jfk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c2131b2-ccbc-49ff-a0bc-fd6639563dd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0d7eacf1e5180d67f73b2e794d7a0a995619d05d1019a3e152e3fa4b7c4c369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5g8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jfk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:02Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.558113 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:02Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.572826 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.572915 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.572934 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.572958 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.572975 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:02Z","lastTransitionTime":"2026-02-23T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.578555 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a35acbfb5836e5ddac27f7e140e64897f9f2ad3615e4db66b091b79744c699d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c4f2f9d3265d84be4a241b5a2a0193896ad4e9daeec1aaca6fe4fef0be21c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:02Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.593464 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zhj4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dd002d-a62b-432e-bc82-21ffdfc5e0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3765b3d0261e124aa7d128e9bdf43a6c76939b5d63f3960ab38a129bf3476c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8649q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zhj4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:02Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.624424 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-59rkm_66853c8a-9391-4291-b5f1-c72cb5fe23e8/ovnkube-controller/1.log" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.629663 4735 scope.go:117] "RemoveContainer" containerID="feb388ead34e3f72990c3f6cb25e388f77d137e16df82dc8ae239f146b7154f1" Feb 23 00:08:02 crc kubenswrapper[4735]: E0223 00:08:02.630043 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-59rkm_openshift-ovn-kubernetes(66853c8a-9391-4291-b5f1-c72cb5fe23e8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.650117 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:02Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.670012 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a35acbfb5836e5ddac27f7e140e64897f9f2ad3615e4db66b091b79744c699d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c4f2f9d3265d84be4a241b5a2a0193896ad4e9daeec1aaca6fe4fef0be21c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:02Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.676370 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.676431 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.676451 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.676476 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.676495 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:02Z","lastTransitionTime":"2026-02-23T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.687339 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zhj4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dd002d-a62b-432e-bc82-21ffdfc5e0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3765b3d0261e124aa7d128e9bdf43a6c76939b5d63f3960ab38a129bf3476c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8649q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zhj4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:02Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.706680 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1153ca78-e3e2-458f-8a31-e16738c1677c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43f8fa8d2d76248580ea4800c8e0e97c060feb42b7c78fa9be3f5adbeaaf9437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a47ab0d8562bafd3080c29289e6cd496a9f76188960a3b6036cc044609876406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd23f1d94c143d946e301f9af58c6fe189e1e95ef05c2648ad167acab65726df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb8e52070bd8c7a9784d6f81037eee627297839cf49ec88b861a6d8f3a7a16e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:02Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.727423 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://340ce9e1f14e0dc6339a25b1bd23f58d9790656cfea764bac921d60603920cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:02Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.744907 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04edd9324344faf24b9181705097f238050d69e14bf46da2f76f3d114552c942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:02Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.761809 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cba474f-2d55-4a07-969f-25e2817a06d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3ae04d2c5dac3e689b744efd8990e7210f627cf79b77aab8cdb477e22eedeef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://908fd79ca8e314a794a24b502c23d0b5abbc41b2bf079086e8fe6fc3ceaeafa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-blmnv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-02-23T00:08:02Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.779388 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.779445 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.779463 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.779488 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.779505 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:02Z","lastTransitionTime":"2026-02-23T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.785075 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26428" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29209462-90c5-4aa1-9943-d8b15ac1b5a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23e62d470b47146565ead752e6fb5578b3951831fd10126ec5ba4752e6a42e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13db4202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13db4202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26428\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:02Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.806344 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40cc14f8-4b0d-4268-82bc-e9c2d8073cf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"re
source-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T00:07:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0223 00:07:35.797171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 00:07:35.799032 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411013438/tls.crt::/tmp/serving-cert-1411013438/tls.key\\\\\\\"\\\\nI0223 00:07:41.683988 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0223 00:07:41.688343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0223 00:07:41.688375 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0223 00:07:41.688410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0223 00:07:41.688422 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0223 00:07:41.698776 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0223 00:07:41.698820 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698840 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0223 00:07:41.698875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0223 00:07:41.698881 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0223 00:07:41.698924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0223 00:07:41.699038 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0223 00:07:41.701822 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T0
0:07:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:02Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.824825 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:02Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.843714 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:02Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.871837 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zmdgs"] Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.872475 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zmdgs" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.873185 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4gvxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b63c18f-b6b2-4d97-b542-7800b475bd4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33055c975ea730c1618798b3885ea706d784d23c63c80c8fbb66ffbc27318c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4pzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4gvxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-23T00:08:02Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.877327 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.877420 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.881923 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.881983 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.882046 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.882069 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.882087 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:02Z","lastTransitionTime":"2026-02-23T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.909881 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66853c8a-9391-4291-b5f1-c72cb5fe23e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb388ead34e3f72990c3f6cb25e388f77d137e16df82dc8ae239f146b7154f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feb388ead34e3f72990c3f6cb25e388f77d137e16df82dc8ae239f146b7154f1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T00:08:00Z\\\",\\\"message\\\":\\\" 6 for removal\\\\nI0223 00:08:00.703109 6175 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0223 00:08:00.703066 6175 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 00:08:00.703206 6175 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0223 00:08:00.703283 6175 
handler.go:208] Removed *v1.Pod event handler 6\\\\nI0223 00:08:00.703382 6175 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0223 00:08:00.703454 6175 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0223 00:08:00.703472 6175 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0223 00:08:00.703502 6175 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0223 00:08:00.703522 6175 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0223 00:08:00.703503 6175 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0223 00:08:00.703546 6175 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0223 00:08:00.703552 6175 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0223 00:08:00.703566 6175 factory.go:656] Stopping watch factory\\\\nI0223 00:08:00.703580 6175 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0223 00:08:00.703589 6175 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-59rkm_openshift-ovn-kubernetes(66853c8a-9391-4291-b5f1-c72cb5fe23e8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17
e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-59rkm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:02Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.939158 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6c844ff7-f40a-43c5-bf50-dae480a77428-env-overrides\") pod \"ovnkube-control-plane-749d76644c-zmdgs\" (UID: \"6c844ff7-f40a-43c5-bf50-dae480a77428\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zmdgs" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.939224 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6c844ff7-f40a-43c5-bf50-dae480a77428-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-zmdgs\" (UID: \"6c844ff7-f40a-43c5-bf50-dae480a77428\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zmdgs" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.939292 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6c844ff7-f40a-43c5-bf50-dae480a77428-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-zmdgs\" (UID: \"6c844ff7-f40a-43c5-bf50-dae480a77428\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zmdgs" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.939493 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52p6d\" (UniqueName: \"kubernetes.io/projected/6c844ff7-f40a-43c5-bf50-dae480a77428-kube-api-access-52p6d\") pod \"ovnkube-control-plane-749d76644c-zmdgs\" (UID: \"6c844ff7-f40a-43c5-bf50-dae480a77428\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zmdgs" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.942428 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20a3f0f8-dc0e-4c2c-95ac-2ef9874425f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:02Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.959148 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jfk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c2131b2-ccbc-49ff-a0bc-fd6639563dd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0d7eacf1e5180d67f73b2e794d7a0a995619d05d1019a3e152e3fa4b7c4c369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5g8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jfk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:02Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.984834 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.985072 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.985285 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.985495 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:02 crc 
kubenswrapper[4735]: I0223 00:08:02.985682 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:02Z","lastTransitionTime":"2026-02-23T00:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:02 crc kubenswrapper[4735]: I0223 00:08:02.992252 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20a3f0f8-dc0e-4c2c-95ac-2ef9874425f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\
\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68
77441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:02Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.008626 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jfk7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c2131b2-ccbc-49ff-a0bc-fd6639563dd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0d7eacf1e5180d67f73b2e794d7a0a995619d05d1019a3e152e3fa4b7c4c369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5g8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jfk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:03Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.027482 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:03Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.040514 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6c844ff7-f40a-43c5-bf50-dae480a77428-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-zmdgs\" (UID: \"6c844ff7-f40a-43c5-bf50-dae480a77428\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zmdgs" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.040633 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52p6d\" (UniqueName: \"kubernetes.io/projected/6c844ff7-f40a-43c5-bf50-dae480a77428-kube-api-access-52p6d\") pod \"ovnkube-control-plane-749d76644c-zmdgs\" (UID: \"6c844ff7-f40a-43c5-bf50-dae480a77428\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zmdgs" Feb 23 00:08:03 crc kubenswrapper[4735]: 
I0223 00:08:03.040701 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6c844ff7-f40a-43c5-bf50-dae480a77428-env-overrides\") pod \"ovnkube-control-plane-749d76644c-zmdgs\" (UID: \"6c844ff7-f40a-43c5-bf50-dae480a77428\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zmdgs" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.041055 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6c844ff7-f40a-43c5-bf50-dae480a77428-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-zmdgs\" (UID: \"6c844ff7-f40a-43c5-bf50-dae480a77428\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zmdgs" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.042527 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6c844ff7-f40a-43c5-bf50-dae480a77428-env-overrides\") pod \"ovnkube-control-plane-749d76644c-zmdgs\" (UID: \"6c844ff7-f40a-43c5-bf50-dae480a77428\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zmdgs" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.043177 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6c844ff7-f40a-43c5-bf50-dae480a77428-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-zmdgs\" (UID: \"6c844ff7-f40a-43c5-bf50-dae480a77428\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zmdgs" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.048286 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a35acbfb5836e5ddac27f7e140e64897f9f2ad3615e4db66b091b79744c699d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c4f2f9d3265d84be4a241b5a2a0193896ad4e9daeec1aaca6fe4fef0be21c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:03Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.054688 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6c844ff7-f40a-43c5-bf50-dae480a77428-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-zmdgs\" (UID: \"6c844ff7-f40a-43c5-bf50-dae480a77428\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zmdgs" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.069018 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zhj4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dd002d-a62b-432e-bc82-21ffdfc5e0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3765b3d0261e124aa7d128e9bdf43a6c76939b5d63f3960ab38a129bf3476c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8649q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zhj4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:03Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.070420 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52p6d\" (UniqueName: \"kubernetes.io/projected/6c844ff7-f40a-43c5-bf50-dae480a77428-kube-api-access-52p6d\") pod \"ovnkube-control-plane-749d76644c-zmdgs\" (UID: \"6c844ff7-f40a-43c5-bf50-dae480a77428\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zmdgs" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.088741 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zmdgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c844ff7-f40a-43c5-bf50-dae480a77428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52p6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52p6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:08:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zmdgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:03Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.089245 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.089302 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.089321 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.089344 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.089362 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:03Z","lastTransitionTime":"2026-02-23T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.109254 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1153ca78-e3e2-458f-8a31-e16738c1677c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43f8fa8d2d76248580ea4800c8e0e97c060feb42b7c78fa9be3f5adbeaaf9437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a47ab0d8562
bafd3080c29289e6cd496a9f76188960a3b6036cc044609876406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd23f1d94c143d946e301f9af58c6fe189e1e95ef05c2648ad167acab65726df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb8e52070bd8c7a9784d6f81037eee627297839cf49ec88b861a6d8f3a7a16e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:03Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.129577 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://340ce9e1f14e0dc6339a25b1bd23f58d9790656cfea764bac921d60603920cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:03Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.147681 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04edd9324344faf24b9181705097f238050d69e14bf46da2f76f3d114552c942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:03Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.164179 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cba474f-2d55-4a07-969f-25e2817a06d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3ae04d2c5dac3e689b744efd8990e7210f627cf79b77aab8cdb477e22eedeef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://908fd79ca8e314a794a24b502c23d0b5abbc41b2bf079086e8fe6fc3ceaeafa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-blmnv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-02-23T00:08:03Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.188589 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26428" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29209462-90c5-4aa1-9943-d8b15ac1b5a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23e62d470b47146565ead752e6fb5578b3951831fd10126ec5ba4752e6a42e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\"
:false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-relea
se\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13db4202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13db4202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-binco
py\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"
Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26428\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:03Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.192931 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.192991 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.193008 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.193032 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.193075 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:03Z","lastTransitionTime":"2026-02-23T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.195956 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zmdgs" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.213632 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40cc14f8-4b0d-4268-82bc-e9c2d8073cf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\
":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apise
rver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T00:07:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0223 00:07:35.797171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 00:07:35.799032 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411013438/tls.crt::/tmp/serving-cert-1411013438/tls.key\\\\\\\"\\\\nI0223 00:07:41.683988 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0223 00:07:41.688343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0223 00:07:41.688375 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0223 00:07:41.688410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0223 00:07:41.688422 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0223 00:07:41.698776 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0223 00:07:41.698820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698840 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0223 00:07:41.698875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0223 
00:07:41.698881 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0223 00:07:41.698924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0223 00:07:41.699038 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0223 00:07:41.701822 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:03Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.223918 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 21:55:01.150297927 +0000 UTC Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.234136 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:03Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.252797 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:03Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.271945 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.272422 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4gvxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b63c18f-b6b2-4d97-b542-7800b475bd4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33055c975ea730c1618798b3885ea706d784d23c63c80c8fbb66ffbc27318c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\
"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4pzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4gvxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-23T00:08:03Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:03 crc kubenswrapper[4735]: E0223 00:08:03.272983 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.296698 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.296772 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.296789 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.296818 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.296838 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:03Z","lastTransitionTime":"2026-02-23T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.305691 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66853c8a-9391-4291-b5f1-c72cb5fe23e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb388ead34e3f72990c3f6cb25e388f77d137e16df82dc8ae239f146b7154f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feb388ead34e3f72990c3f6cb25e388f77d137e16df82dc8ae239f146b7154f1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T00:08:00Z\\\",\\\"message\\\":\\\" 6 for removal\\\\nI0223 00:08:00.703109 6175 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0223 00:08:00.703066 6175 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 00:08:00.703206 6175 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0223 00:08:00.703283 6175 
handler.go:208] Removed *v1.Pod event handler 6\\\\nI0223 00:08:00.703382 6175 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0223 00:08:00.703454 6175 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0223 00:08:00.703472 6175 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0223 00:08:00.703502 6175 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0223 00:08:00.703522 6175 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0223 00:08:00.703503 6175 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0223 00:08:00.703546 6175 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0223 00:08:00.703552 6175 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0223 00:08:00.703566 6175 factory.go:656] Stopping watch factory\\\\nI0223 00:08:00.703580 6175 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0223 00:08:00.703589 6175 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-59rkm_openshift-ovn-kubernetes(66853c8a-9391-4291-b5f1-c72cb5fe23e8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17
e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-59rkm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:03Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.401243 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.401322 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.401342 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.401373 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.401400 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:03Z","lastTransitionTime":"2026-02-23T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.504112 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.504160 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.504180 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.504228 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.504247 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:03Z","lastTransitionTime":"2026-02-23T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.606537 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.606576 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.606588 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.606605 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.606616 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:03Z","lastTransitionTime":"2026-02-23T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.635687 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zmdgs" event={"ID":"6c844ff7-f40a-43c5-bf50-dae480a77428","Type":"ContainerStarted","Data":"1e72aa9d3da2d55c67384b2de4aaab9402eff6a464c662ca173f18affd95f4a3"} Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.635757 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zmdgs" event={"ID":"6c844ff7-f40a-43c5-bf50-dae480a77428","Type":"ContainerStarted","Data":"ae14a0327db69c9cd0b4cb514bee921487ce35aaeb18535d8d44bc172fcdf59d"} Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.635779 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zmdgs" event={"ID":"6c844ff7-f40a-43c5-bf50-dae480a77428","Type":"ContainerStarted","Data":"52130a8237634bc00a2efb758b5a12f68799b4a935154ee27cfa2c36d4cdcc24"} Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.658433 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:03Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.684791 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a35acbfb5836e5ddac27f7e140e64897f9f2ad3615e4db66b091b79744c699d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c4f2f9d3265d84be4a241b5a2a0193896ad4e9daeec1aaca6fe4fef0be21c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:03Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.698708 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zhj4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dd002d-a62b-432e-bc82-21ffdfc5e0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3765b3d0261e124aa7d128e9bdf43a6c76939b5d63f3960ab38a129bf3476c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8649q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zhj4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:03Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.709906 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.709980 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.710007 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.710056 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.710079 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:03Z","lastTransitionTime":"2026-02-23T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.717326 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zmdgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c844ff7-f40a-43c5-bf50-dae480a77428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae14a0327db69c9cd0b4cb514bee921487ce35aaeb18535d8d44bc172fcdf59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52p6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e72aa9d3da2d55c67384b2de4aaab9402eff6a464c662ca173f18affd95f4a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52p6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:08:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zmdgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:03Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.742462 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1153ca78-e3e2-458f-8a31-e16738c1677c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43f8fa8d2d76248580ea4800c8e0e97c060feb42b7c78fa9be3f5adbeaaf9437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a47ab0d8562bafd3080c29289e6cd496a9f76188960a3b6036cc044609876406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd23f1d94c143d946e301f9af58c6fe189e1e95ef05c2648ad167acab65726df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb8e52070bd8c7a9784d6f81037eee627297839cf49ec88b861a6d8f3a7a16e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:03Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.764943 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://340ce9e1f14e0dc6339a25b1bd23f58d9790656cfea764bac921d60603920cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:03Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.782721 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04edd9324344faf24b9181705097f238050d69e14bf46da2f76f3d114552c942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:03Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.798463 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cba474f-2d55-4a07-969f-25e2817a06d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3ae04d2c5dac3e689b744efd8990e7210f627cf79b77aab8cdb477e22eedeef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://908fd79ca8e314a794a24b502c23d0b5abbc41b2bf079086e8fe6fc3ceaeafa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-blmnv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-02-23T00:08:03Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.813444 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.813490 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.813508 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.813533 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.813549 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:03Z","lastTransitionTime":"2026-02-23T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.823603 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26428" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29209462-90c5-4aa1-9943-d8b15ac1b5a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23e62d470b47146565ead752e6fb5578b3951831fd10126ec5ba4752e6a42e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13db4202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13db4202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26428\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:03Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.845279 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40cc14f8-4b0d-4268-82bc-e9c2d8073cf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"re
source-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T00:07:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0223 00:07:35.797171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 00:07:35.799032 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411013438/tls.crt::/tmp/serving-cert-1411013438/tls.key\\\\\\\"\\\\nI0223 00:07:41.683988 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0223 00:07:41.688343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0223 00:07:41.688375 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0223 00:07:41.688410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0223 00:07:41.688422 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0223 00:07:41.698776 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0223 00:07:41.698820 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698840 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0223 00:07:41.698875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0223 00:07:41.698881 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0223 00:07:41.698924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0223 00:07:41.699038 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0223 00:07:41.701822 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T0
0:07:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:03Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.862475 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:03Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.879507 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:03Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.896400 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4gvxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b63c18f-b6b2-4d97-b542-7800b475bd4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33055c975ea730c1618798b3885ea706d784d23c63c80c8fbb66ffbc27318c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4pzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4gvxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:03Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.916101 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:03 crc 
kubenswrapper[4735]: I0223 00:08:03.916157 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.916169 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.916189 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.916201 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:03Z","lastTransitionTime":"2026-02-23T00:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.928370 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66853c8a-9391-4291-b5f1-c72cb5fe23e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb388ead34e3f72990c3f6cb25e388f77d137e16df82dc8ae239f146b7154f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feb388ead34e3f72990c3f6cb25e388f77d137e16df82dc8ae239f146b7154f1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T00:08:00Z\\\",\\\"message\\\":\\\" 6 for removal\\\\nI0223 00:08:00.703109 6175 handler.go:190] Sending *v1.Pod event 
handler 3 for removal\\\\nI0223 00:08:00.703066 6175 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 00:08:00.703206 6175 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0223 00:08:00.703283 6175 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0223 00:08:00.703382 6175 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0223 00:08:00.703454 6175 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0223 00:08:00.703472 6175 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0223 00:08:00.703502 6175 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0223 00:08:00.703522 6175 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0223 00:08:00.703503 6175 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0223 00:08:00.703546 6175 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0223 00:08:00.703552 6175 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0223 00:08:00.703566 6175 factory.go:656] Stopping watch factory\\\\nI0223 00:08:00.703580 6175 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0223 00:08:00.703589 6175 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-59rkm_openshift-ovn-kubernetes(66853c8a-9391-4291-b5f1-c72cb5fe23e8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17
e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-59rkm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:03Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.953575 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20a3f0f8-dc0e-4c2c-95ac-2ef9874425f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:03Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:03 crc kubenswrapper[4735]: I0223 00:08:03.970733 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jfk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c2131b2-ccbc-49ff-a0bc-fd6639563dd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0d7eacf1e5180d67f73b2e794d7a0a995619d05d1019a3e152e3fa4b7c4c369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5g8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jfk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:03Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.003669 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-bdqfd"] Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.004358 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:08:04 crc kubenswrapper[4735]: E0223 00:08:04.004455 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.018748 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.018796 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.018815 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.018838 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.018882 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:04Z","lastTransitionTime":"2026-02-23T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.024574 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1153ca78-e3e2-458f-8a31-e16738c1677c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43f8fa8d2d76248580ea4800c8e0e97c060feb42b7c78fa9be3f5adbeaaf9437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a47ab0d8562
bafd3080c29289e6cd496a9f76188960a3b6036cc044609876406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd23f1d94c143d946e301f9af58c6fe189e1e95ef05c2648ad167acab65726df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb8e52070bd8c7a9784d6f81037eee627297839cf49ec88b861a6d8f3a7a16e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:04Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.038891 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://340ce9e1f14e0dc6339a25b1bd23f58d9790656cfea764bac921d60603920cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:04Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.052962 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04edd9324344faf24b9181705097f238050d69e14bf46da2f76f3d114552c942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:04Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.053600 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b542cb9e-35cc-44d9-a850-c41887636c4c-metrics-certs\") pod \"network-metrics-daemon-bdqfd\" (UID: \"b542cb9e-35cc-44d9-a850-c41887636c4c\") " pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.053677 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8qrl\" (UniqueName: \"kubernetes.io/projected/b542cb9e-35cc-44d9-a850-c41887636c4c-kube-api-access-k8qrl\") pod \"network-metrics-daemon-bdqfd\" (UID: \"b542cb9e-35cc-44d9-a850-c41887636c4c\") " pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.070705 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cba474f-2d55-4a07-969f-25e2817a06d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3ae04d2c5dac3e689b744efd8990e7210f627cf79b77aab8cdb477e22eedeef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://908fd79ca8e314a794a24b502c23d0b5abbc41b2
bf079086e8fe6fc3ceaeafa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-blmnv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:04Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.093665 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26428" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29209462-90c5-4aa1-9943-d8b15ac1b5a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23e62d470b47146565ead752e6fb5578b3951831fd10126ec5ba4752e6a42e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13db4
202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13db4202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26428\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:04Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.110847 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40cc14f8-4b0d-4268-82bc-e9c2d8073cf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T00:07:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0223 00:07:35.797171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 00:07:35.799032 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411013438/tls.crt::/tmp/serving-cert-1411013438/tls.key\\\\\\\"\\\\nI0223 00:07:41.683988 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0223 00:07:41.688343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0223 00:07:41.688375 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0223 00:07:41.688410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0223 00:07:41.688422 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0223 00:07:41.698776 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0223 00:07:41.698820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698840 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0223 00:07:41.698875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0223 00:07:41.698881 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0223 00:07:41.698924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0223 00:07:41.699038 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0223 00:07:41.701822 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:04Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.122248 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.122308 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.122325 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.122351 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.122368 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:04Z","lastTransitionTime":"2026-02-23T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.133543 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:04Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.152799 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:04Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.155292 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b542cb9e-35cc-44d9-a850-c41887636c4c-metrics-certs\") pod \"network-metrics-daemon-bdqfd\" (UID: \"b542cb9e-35cc-44d9-a850-c41887636c4c\") " pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.155350 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8qrl\" 
(UniqueName: \"kubernetes.io/projected/b542cb9e-35cc-44d9-a850-c41887636c4c-kube-api-access-k8qrl\") pod \"network-metrics-daemon-bdqfd\" (UID: \"b542cb9e-35cc-44d9-a850-c41887636c4c\") " pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:08:04 crc kubenswrapper[4735]: E0223 00:08:04.155476 4735 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 00:08:04 crc kubenswrapper[4735]: E0223 00:08:04.155583 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b542cb9e-35cc-44d9-a850-c41887636c4c-metrics-certs podName:b542cb9e-35cc-44d9-a850-c41887636c4c nodeName:}" failed. No retries permitted until 2026-02-23 00:08:04.655556128 +0000 UTC m=+43.119102139 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b542cb9e-35cc-44d9-a850-c41887636c4c-metrics-certs") pod "network-metrics-daemon-bdqfd" (UID: "b542cb9e-35cc-44d9-a850-c41887636c4c") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.173768 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4gvxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b63c18f-b6b2-4d97-b542-7800b475bd4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33055c975ea730c1618798b3885ea706d784d23c63c80c8fbb66ffbc27318c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4pzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4gvxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:04Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.187697 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8qrl\" (UniqueName: 
\"kubernetes.io/projected/b542cb9e-35cc-44d9-a850-c41887636c4c-kube-api-access-k8qrl\") pod \"network-metrics-daemon-bdqfd\" (UID: \"b542cb9e-35cc-44d9-a850-c41887636c4c\") " pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.206126 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66853c8a-9391-4291-b5f1-c72cb5fe23e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb388ead34e3f72990c3f6cb25e388f77d137e16df82dc8ae239f146b7154f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feb388ead34e3f72990c3f6cb25e388f77d137e16df82dc8ae239f146b7154f1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T00:08:00Z\\\",\\\"message\\\":\\\" 6 for removal\\\\nI0223 00:08:00.703109 6175 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0223 00:08:00.703066 6175 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 00:08:00.703206 6175 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0223 00:08:00.703283 6175 
handler.go:208] Removed *v1.Pod event handler 6\\\\nI0223 00:08:00.703382 6175 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0223 00:08:00.703454 6175 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0223 00:08:00.703472 6175 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0223 00:08:00.703502 6175 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0223 00:08:00.703522 6175 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0223 00:08:00.703503 6175 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0223 00:08:00.703546 6175 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0223 00:08:00.703552 6175 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0223 00:08:00.703566 6175 factory.go:656] Stopping watch factory\\\\nI0223 00:08:00.703580 6175 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0223 00:08:00.703589 6175 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-59rkm_openshift-ovn-kubernetes(66853c8a-9391-4291-b5f1-c72cb5fe23e8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17
e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-59rkm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:04Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.224051 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 18:00:17.625591397 +0000 UTC Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.225702 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.225732 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.225740 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.225770 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:04 crc kubenswrapper[4735]: 
I0223 00:08:04.225781 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:04Z","lastTransitionTime":"2026-02-23T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.241168 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20a3f0f8-dc0e-4c2c-95ac-2ef9874425f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:04Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.257904 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jfk7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c2131b2-ccbc-49ff-a0bc-fd6639563dd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0d7eacf1e5180d67f73b2e794d7a0a995619d05d1019a3e152e3fa4b7c4c369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5g8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jfk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:04Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.271429 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.271483 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:08:04 crc kubenswrapper[4735]: E0223 00:08:04.271588 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 00:08:04 crc kubenswrapper[4735]: E0223 00:08:04.271766 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.279024 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:04Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.297428 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a35acbfb5836e5ddac27f7e140e64897f9f2ad3615e4db66b091b79744c699d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c4f2f9d3265d84be4a241b5a2a0193896ad4e9daeec1aaca6fe4fef0be21c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:04Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.313064 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zhj4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dd002d-a62b-432e-bc82-21ffdfc5e0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3765b3d0261e124aa7d128e9bdf43a6c76939b5d63f3960ab38a129bf3476c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8649q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zhj4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:04Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.329114 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.329163 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.329181 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.329205 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.329223 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:04Z","lastTransitionTime":"2026-02-23T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.330538 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zmdgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c844ff7-f40a-43c5-bf50-dae480a77428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae14a0327db69c9cd0b4cb514bee921487ce35aaeb18535d8d44bc172fcdf59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52p6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e72aa9d3da2d55c67384b2de4aaab9402eff6a464c662ca173f18affd95f4a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52p6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:08:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zmdgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:04Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.346405 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bdqfd" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b542cb9e-35cc-44d9-a850-c41887636c4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8qrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8qrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:08:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bdqfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:04Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:04 crc 
kubenswrapper[4735]: I0223 00:08:04.432221 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.432285 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.432309 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.432341 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.432361 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:04Z","lastTransitionTime":"2026-02-23T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.535280 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.535340 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.535399 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.535500 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.535519 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:04Z","lastTransitionTime":"2026-02-23T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.643144 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.643454 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.643480 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.643498 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.643510 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:04Z","lastTransitionTime":"2026-02-23T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.660150 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b542cb9e-35cc-44d9-a850-c41887636c4c-metrics-certs\") pod \"network-metrics-daemon-bdqfd\" (UID: \"b542cb9e-35cc-44d9-a850-c41887636c4c\") " pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:08:04 crc kubenswrapper[4735]: E0223 00:08:04.660342 4735 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 00:08:04 crc kubenswrapper[4735]: E0223 00:08:04.660434 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b542cb9e-35cc-44d9-a850-c41887636c4c-metrics-certs podName:b542cb9e-35cc-44d9-a850-c41887636c4c nodeName:}" failed. No retries permitted until 2026-02-23 00:08:05.66041332 +0000 UTC m=+44.123959301 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b542cb9e-35cc-44d9-a850-c41887636c4c-metrics-certs") pod "network-metrics-daemon-bdqfd" (UID: "b542cb9e-35cc-44d9-a850-c41887636c4c") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.747052 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.747104 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.747122 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.747144 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.747160 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:04Z","lastTransitionTime":"2026-02-23T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.850328 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.850387 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.850404 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.850428 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.850445 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:04Z","lastTransitionTime":"2026-02-23T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.953705 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.953761 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.953778 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.953802 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:04 crc kubenswrapper[4735]: I0223 00:08:04.953819 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:04Z","lastTransitionTime":"2026-02-23T00:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.056215 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.056259 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.056278 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.056300 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.056316 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:05Z","lastTransitionTime":"2026-02-23T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.159730 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.159796 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.159813 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.159838 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.159884 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:05Z","lastTransitionTime":"2026-02-23T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.224617 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 19:08:46.646939409 +0000 UTC Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.265638 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.265703 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.265722 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.265744 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.265760 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:05Z","lastTransitionTime":"2026-02-23T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.271571 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:08:05 crc kubenswrapper[4735]: E0223 00:08:05.271786 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c" Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.272297 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:08:05 crc kubenswrapper[4735]: E0223 00:08:05.272666 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.368396 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.368949 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.369112 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.369241 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.369357 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:05Z","lastTransitionTime":"2026-02-23T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.472170 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.472232 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.472249 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.472272 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.472289 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:05Z","lastTransitionTime":"2026-02-23T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.575399 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.575477 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.575494 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.575518 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.575536 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:05Z","lastTransitionTime":"2026-02-23T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.672823 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b542cb9e-35cc-44d9-a850-c41887636c4c-metrics-certs\") pod \"network-metrics-daemon-bdqfd\" (UID: \"b542cb9e-35cc-44d9-a850-c41887636c4c\") " pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:08:05 crc kubenswrapper[4735]: E0223 00:08:05.673124 4735 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 00:08:05 crc kubenswrapper[4735]: E0223 00:08:05.673254 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b542cb9e-35cc-44d9-a850-c41887636c4c-metrics-certs podName:b542cb9e-35cc-44d9-a850-c41887636c4c nodeName:}" failed. No retries permitted until 2026-02-23 00:08:07.673219583 +0000 UTC m=+46.136765594 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b542cb9e-35cc-44d9-a850-c41887636c4c-metrics-certs") pod "network-metrics-daemon-bdqfd" (UID: "b542cb9e-35cc-44d9-a850-c41887636c4c") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.679385 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.679431 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.679448 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.679471 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.679488 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:05Z","lastTransitionTime":"2026-02-23T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.782497 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.782545 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.782566 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.782587 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.782604 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:05Z","lastTransitionTime":"2026-02-23T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.885390 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.885470 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.885494 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.885524 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.885554 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:05Z","lastTransitionTime":"2026-02-23T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.988674 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.988734 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.988752 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.988778 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:05 crc kubenswrapper[4735]: I0223 00:08:05.988796 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:05Z","lastTransitionTime":"2026-02-23T00:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:06 crc kubenswrapper[4735]: I0223 00:08:06.091439 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:06 crc kubenswrapper[4735]: I0223 00:08:06.091507 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:06 crc kubenswrapper[4735]: I0223 00:08:06.091531 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:06 crc kubenswrapper[4735]: I0223 00:08:06.091565 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:06 crc kubenswrapper[4735]: I0223 00:08:06.091588 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:06Z","lastTransitionTime":"2026-02-23T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:06 crc kubenswrapper[4735]: I0223 00:08:06.194798 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:06 crc kubenswrapper[4735]: I0223 00:08:06.194895 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:06 crc kubenswrapper[4735]: I0223 00:08:06.194924 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:06 crc kubenswrapper[4735]: I0223 00:08:06.194950 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:06 crc kubenswrapper[4735]: I0223 00:08:06.194997 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:06Z","lastTransitionTime":"2026-02-23T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:06 crc kubenswrapper[4735]: I0223 00:08:06.225023 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 13:50:12.464356219 +0000 UTC Feb 23 00:08:06 crc kubenswrapper[4735]: I0223 00:08:06.271885 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:08:06 crc kubenswrapper[4735]: I0223 00:08:06.271914 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:08:06 crc kubenswrapper[4735]: E0223 00:08:06.272091 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 00:08:06 crc kubenswrapper[4735]: E0223 00:08:06.272196 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 00:08:06 crc kubenswrapper[4735]: I0223 00:08:06.298238 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:06 crc kubenswrapper[4735]: I0223 00:08:06.298303 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:06 crc kubenswrapper[4735]: I0223 00:08:06.298322 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:06 crc kubenswrapper[4735]: I0223 00:08:06.298346 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:06 crc kubenswrapper[4735]: I0223 00:08:06.298364 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:06Z","lastTransitionTime":"2026-02-23T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:06 crc kubenswrapper[4735]: I0223 00:08:06.400834 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:06 crc kubenswrapper[4735]: I0223 00:08:06.400929 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:06 crc kubenswrapper[4735]: I0223 00:08:06.400952 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:06 crc kubenswrapper[4735]: I0223 00:08:06.400984 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:06 crc kubenswrapper[4735]: I0223 00:08:06.401016 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:06Z","lastTransitionTime":"2026-02-23T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:06 crc kubenswrapper[4735]: I0223 00:08:06.504453 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:06 crc kubenswrapper[4735]: I0223 00:08:06.504520 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:06 crc kubenswrapper[4735]: I0223 00:08:06.504536 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:06 crc kubenswrapper[4735]: I0223 00:08:06.504560 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:06 crc kubenswrapper[4735]: I0223 00:08:06.504577 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:06Z","lastTransitionTime":"2026-02-23T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:06 crc kubenswrapper[4735]: I0223 00:08:06.607521 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:06 crc kubenswrapper[4735]: I0223 00:08:06.607593 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:06 crc kubenswrapper[4735]: I0223 00:08:06.607615 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:06 crc kubenswrapper[4735]: I0223 00:08:06.607644 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:06 crc kubenswrapper[4735]: I0223 00:08:06.607666 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:06Z","lastTransitionTime":"2026-02-23T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:06 crc kubenswrapper[4735]: I0223 00:08:06.710339 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:06 crc kubenswrapper[4735]: I0223 00:08:06.710417 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:06 crc kubenswrapper[4735]: I0223 00:08:06.710441 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:06 crc kubenswrapper[4735]: I0223 00:08:06.710472 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:06 crc kubenswrapper[4735]: I0223 00:08:06.710496 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:06Z","lastTransitionTime":"2026-02-23T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:06 crc kubenswrapper[4735]: I0223 00:08:06.813971 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:06 crc kubenswrapper[4735]: I0223 00:08:06.814026 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:06 crc kubenswrapper[4735]: I0223 00:08:06.814042 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:06 crc kubenswrapper[4735]: I0223 00:08:06.814064 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:06 crc kubenswrapper[4735]: I0223 00:08:06.814081 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:06Z","lastTransitionTime":"2026-02-23T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:06 crc kubenswrapper[4735]: I0223 00:08:06.916928 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:06 crc kubenswrapper[4735]: I0223 00:08:06.917020 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:06 crc kubenswrapper[4735]: I0223 00:08:06.917038 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:06 crc kubenswrapper[4735]: I0223 00:08:06.917061 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:06 crc kubenswrapper[4735]: I0223 00:08:06.917078 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:06Z","lastTransitionTime":"2026-02-23T00:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.020095 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.020164 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.020182 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.020208 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.020226 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:07Z","lastTransitionTime":"2026-02-23T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.123634 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.123665 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.123673 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.123685 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.123694 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:07Z","lastTransitionTime":"2026-02-23T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.225124 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 00:42:36.391164145 +0000 UTC Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.226344 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.226445 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.226472 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.226503 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.226526 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:07Z","lastTransitionTime":"2026-02-23T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.271835 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.271919 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:08:07 crc kubenswrapper[4735]: E0223 00:08:07.272009 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c" Feb 23 00:08:07 crc kubenswrapper[4735]: E0223 00:08:07.272060 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.329518 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.329560 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.329576 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.329599 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.329616 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:07Z","lastTransitionTime":"2026-02-23T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.439886 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.439959 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.439983 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.440014 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.440051 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:07Z","lastTransitionTime":"2026-02-23T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.542802 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.542896 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.542919 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.542949 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.542969 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:07Z","lastTransitionTime":"2026-02-23T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.646586 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.646655 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.646673 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.646701 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.646720 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:07Z","lastTransitionTime":"2026-02-23T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.693289 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b542cb9e-35cc-44d9-a850-c41887636c4c-metrics-certs\") pod \"network-metrics-daemon-bdqfd\" (UID: \"b542cb9e-35cc-44d9-a850-c41887636c4c\") " pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:08:07 crc kubenswrapper[4735]: E0223 00:08:07.693587 4735 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 00:08:07 crc kubenswrapper[4735]: E0223 00:08:07.693710 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b542cb9e-35cc-44d9-a850-c41887636c4c-metrics-certs podName:b542cb9e-35cc-44d9-a850-c41887636c4c nodeName:}" failed. No retries permitted until 2026-02-23 00:08:11.693679177 +0000 UTC m=+50.157225188 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b542cb9e-35cc-44d9-a850-c41887636c4c-metrics-certs") pod "network-metrics-daemon-bdqfd" (UID: "b542cb9e-35cc-44d9-a850-c41887636c4c") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.750328 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.750418 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.750438 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.750462 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.750479 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:07Z","lastTransitionTime":"2026-02-23T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.853452 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.853515 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.853536 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.853560 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.853580 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:07Z","lastTransitionTime":"2026-02-23T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.956961 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.957022 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.957044 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.957075 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:07 crc kubenswrapper[4735]: I0223 00:08:07.957095 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:07Z","lastTransitionTime":"2026-02-23T00:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.060238 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.060289 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.060306 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.060330 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.060346 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:08Z","lastTransitionTime":"2026-02-23T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.163357 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.163420 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.163444 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.163475 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.163497 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:08Z","lastTransitionTime":"2026-02-23T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.226283 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 17:08:15.117700722 +0000 UTC Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.266396 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.266499 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.266519 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.266544 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.266562 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:08Z","lastTransitionTime":"2026-02-23T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.271462 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.271495 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:08:08 crc kubenswrapper[4735]: E0223 00:08:08.272106 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 00:08:08 crc kubenswrapper[4735]: E0223 00:08:08.272173 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.369826 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.369945 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.369963 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.369989 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.370012 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:08Z","lastTransitionTime":"2026-02-23T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.474327 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.474832 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.475018 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.475192 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.475333 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:08Z","lastTransitionTime":"2026-02-23T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.578031 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.578081 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.578098 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.578121 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.578139 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:08Z","lastTransitionTime":"2026-02-23T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.654355 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.654433 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.654452 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.654481 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.654501 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:08Z","lastTransitionTime":"2026-02-23T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:08 crc kubenswrapper[4735]: E0223 00:08:08.676595 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc670e79-a4c0-4f94-a41e-9a217a93a98f\\\",\\\"systemUUID\\\":\\\"34aea2d1-4777-4ec2-a0dd-7ee942962cf5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:08Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.682272 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.682373 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.682394 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.682458 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.682481 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:08Z","lastTransitionTime":"2026-02-23T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:08 crc kubenswrapper[4735]: E0223 00:08:08.703214 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc670e79-a4c0-4f94-a41e-9a217a93a98f\\\",\\\"systemUUID\\\":\\\"34aea2d1-4777-4ec2-a0dd-7ee942962cf5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:08Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.708550 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.708618 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.708646 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.708678 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.708702 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:08Z","lastTransitionTime":"2026-02-23T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:08 crc kubenswrapper[4735]: E0223 00:08:08.728316 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc670e79-a4c0-4f94-a41e-9a217a93a98f\\\",\\\"systemUUID\\\":\\\"34aea2d1-4777-4ec2-a0dd-7ee942962cf5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:08Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.732892 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.732959 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.732983 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.733013 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.733036 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:08Z","lastTransitionTime":"2026-02-23T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:08 crc kubenswrapper[4735]: E0223 00:08:08.753247 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc670e79-a4c0-4f94-a41e-9a217a93a98f\\\",\\\"systemUUID\\\":\\\"34aea2d1-4777-4ec2-a0dd-7ee942962cf5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:08Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.759128 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.759204 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.759228 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.759259 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.759280 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:08Z","lastTransitionTime":"2026-02-23T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:08 crc kubenswrapper[4735]: E0223 00:08:08.780334 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc670e79-a4c0-4f94-a41e-9a217a93a98f\\\",\\\"systemUUID\\\":\\\"34aea2d1-4777-4ec2-a0dd-7ee942962cf5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:08Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:08 crc kubenswrapper[4735]: E0223 00:08:08.780499 4735 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.784186 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.784366 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.784437 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.784468 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.784489 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:08Z","lastTransitionTime":"2026-02-23T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.887968 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.888062 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.888082 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.888115 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.888137 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:08Z","lastTransitionTime":"2026-02-23T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.992415 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.992533 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.992552 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.992583 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:08 crc kubenswrapper[4735]: I0223 00:08:08.992603 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:08Z","lastTransitionTime":"2026-02-23T00:08:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:09 crc kubenswrapper[4735]: I0223 00:08:09.096560 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:09 crc kubenswrapper[4735]: I0223 00:08:09.096640 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:09 crc kubenswrapper[4735]: I0223 00:08:09.096663 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:09 crc kubenswrapper[4735]: I0223 00:08:09.096697 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:09 crc kubenswrapper[4735]: I0223 00:08:09.096719 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:09Z","lastTransitionTime":"2026-02-23T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:09 crc kubenswrapper[4735]: I0223 00:08:09.199993 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:09 crc kubenswrapper[4735]: I0223 00:08:09.200073 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:09 crc kubenswrapper[4735]: I0223 00:08:09.200095 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:09 crc kubenswrapper[4735]: I0223 00:08:09.200130 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:09 crc kubenswrapper[4735]: I0223 00:08:09.200154 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:09Z","lastTransitionTime":"2026-02-23T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:09 crc kubenswrapper[4735]: I0223 00:08:09.226759 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 23:35:45.156759363 +0000 UTC Feb 23 00:08:09 crc kubenswrapper[4735]: I0223 00:08:09.271279 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:08:09 crc kubenswrapper[4735]: I0223 00:08:09.271349 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:08:09 crc kubenswrapper[4735]: E0223 00:08:09.271535 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c" Feb 23 00:08:09 crc kubenswrapper[4735]: E0223 00:08:09.271660 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 00:08:09 crc kubenswrapper[4735]: I0223 00:08:09.304383 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:09 crc kubenswrapper[4735]: I0223 00:08:09.304457 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:09 crc kubenswrapper[4735]: I0223 00:08:09.304481 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:09 crc kubenswrapper[4735]: I0223 00:08:09.304509 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:09 crc kubenswrapper[4735]: I0223 00:08:09.304526 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:09Z","lastTransitionTime":"2026-02-23T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:09 crc kubenswrapper[4735]: I0223 00:08:09.408125 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:09 crc kubenswrapper[4735]: I0223 00:08:09.408196 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:09 crc kubenswrapper[4735]: I0223 00:08:09.408213 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:09 crc kubenswrapper[4735]: I0223 00:08:09.408240 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:09 crc kubenswrapper[4735]: I0223 00:08:09.408257 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:09Z","lastTransitionTime":"2026-02-23T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:09 crc kubenswrapper[4735]: I0223 00:08:09.511614 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:09 crc kubenswrapper[4735]: I0223 00:08:09.511679 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:09 crc kubenswrapper[4735]: I0223 00:08:09.511698 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:09 crc kubenswrapper[4735]: I0223 00:08:09.511726 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:09 crc kubenswrapper[4735]: I0223 00:08:09.511745 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:09Z","lastTransitionTime":"2026-02-23T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:09 crc kubenswrapper[4735]: I0223 00:08:09.615553 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:09 crc kubenswrapper[4735]: I0223 00:08:09.615619 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:09 crc kubenswrapper[4735]: I0223 00:08:09.615636 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:09 crc kubenswrapper[4735]: I0223 00:08:09.615659 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:09 crc kubenswrapper[4735]: I0223 00:08:09.615676 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:09Z","lastTransitionTime":"2026-02-23T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:09 crc kubenswrapper[4735]: I0223 00:08:09.719234 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:09 crc kubenswrapper[4735]: I0223 00:08:09.719297 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:09 crc kubenswrapper[4735]: I0223 00:08:09.719313 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:09 crc kubenswrapper[4735]: I0223 00:08:09.719336 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:09 crc kubenswrapper[4735]: I0223 00:08:09.719356 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:09Z","lastTransitionTime":"2026-02-23T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:09 crc kubenswrapper[4735]: I0223 00:08:09.822228 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:09 crc kubenswrapper[4735]: I0223 00:08:09.822295 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:09 crc kubenswrapper[4735]: I0223 00:08:09.822313 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:09 crc kubenswrapper[4735]: I0223 00:08:09.822340 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:09 crc kubenswrapper[4735]: I0223 00:08:09.822363 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:09Z","lastTransitionTime":"2026-02-23T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:09 crc kubenswrapper[4735]: I0223 00:08:09.924702 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:09 crc kubenswrapper[4735]: I0223 00:08:09.924845 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:09 crc kubenswrapper[4735]: I0223 00:08:09.924907 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:09 crc kubenswrapper[4735]: I0223 00:08:09.924934 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:09 crc kubenswrapper[4735]: I0223 00:08:09.924953 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:09Z","lastTransitionTime":"2026-02-23T00:08:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.028548 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.028608 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.028625 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.028648 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.028670 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:10Z","lastTransitionTime":"2026-02-23T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.131720 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.131793 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.131815 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.131843 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.131898 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:10Z","lastTransitionTime":"2026-02-23T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.227026 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 09:59:20.103781014 +0000 UTC Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.234628 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.234678 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.234695 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.234717 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.234734 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:10Z","lastTransitionTime":"2026-02-23T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.271271 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:08:10 crc kubenswrapper[4735]: E0223 00:08:10.271464 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.272126 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:08:10 crc kubenswrapper[4735]: E0223 00:08:10.272423 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.337389 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.337455 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.337474 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.337498 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.337517 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:10Z","lastTransitionTime":"2026-02-23T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.440911 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.440998 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.441018 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.441048 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.441096 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:10Z","lastTransitionTime":"2026-02-23T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.545441 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.545554 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.545578 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.545617 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.545647 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:10Z","lastTransitionTime":"2026-02-23T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.649811 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.649899 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.649917 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.649944 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.649964 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:10Z","lastTransitionTime":"2026-02-23T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.752943 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.753013 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.753038 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.753068 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.753093 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:10Z","lastTransitionTime":"2026-02-23T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.857198 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.857270 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.857287 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.857315 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.857335 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:10Z","lastTransitionTime":"2026-02-23T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.960311 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.960391 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.960412 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.960446 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:10 crc kubenswrapper[4735]: I0223 00:08:10.960468 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:10Z","lastTransitionTime":"2026-02-23T00:08:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.064079 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.064164 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.064183 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.064213 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.064233 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:11Z","lastTransitionTime":"2026-02-23T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.167150 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.167233 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.167252 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.167284 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.167307 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:11Z","lastTransitionTime":"2026-02-23T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.228144 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 19:30:58.515924823 +0000 UTC Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.270438 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.270484 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.270495 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.270511 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.270522 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:11Z","lastTransitionTime":"2026-02-23T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.271494 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.271553 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:08:11 crc kubenswrapper[4735]: E0223 00:08:11.271640 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 00:08:11 crc kubenswrapper[4735]: E0223 00:08:11.271807 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c" Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.373976 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.374046 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.374065 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.374095 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.374121 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:11Z","lastTransitionTime":"2026-02-23T00:08:11Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.476997 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.477095 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.477138 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.477164 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.477180 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:11Z","lastTransitionTime":"2026-02-23T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.580843 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.580935 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.580953 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.580985 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.581008 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:11Z","lastTransitionTime":"2026-02-23T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.685029 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.685120 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.685145 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.685188 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.685214 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:11Z","lastTransitionTime":"2026-02-23T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.742411 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b542cb9e-35cc-44d9-a850-c41887636c4c-metrics-certs\") pod \"network-metrics-daemon-bdqfd\" (UID: \"b542cb9e-35cc-44d9-a850-c41887636c4c\") " pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:08:11 crc kubenswrapper[4735]: E0223 00:08:11.742736 4735 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 00:08:11 crc kubenswrapper[4735]: E0223 00:08:11.742925 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b542cb9e-35cc-44d9-a850-c41887636c4c-metrics-certs podName:b542cb9e-35cc-44d9-a850-c41887636c4c nodeName:}" failed. No retries permitted until 2026-02-23 00:08:19.742842025 +0000 UTC m=+58.206388146 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b542cb9e-35cc-44d9-a850-c41887636c4c-metrics-certs") pod "network-metrics-daemon-bdqfd" (UID: "b542cb9e-35cc-44d9-a850-c41887636c4c") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.788293 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.788338 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.788349 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.788365 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.788378 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:11Z","lastTransitionTime":"2026-02-23T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.892030 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.892107 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.892131 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.892159 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.892185 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:11Z","lastTransitionTime":"2026-02-23T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.996384 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.996451 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.996469 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.996494 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:11 crc kubenswrapper[4735]: I0223 00:08:11.996511 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:11Z","lastTransitionTime":"2026-02-23T00:08:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.099572 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.099638 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.099656 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.099687 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.099704 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:12Z","lastTransitionTime":"2026-02-23T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.203831 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.203915 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.203933 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.203956 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.203973 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:12Z","lastTransitionTime":"2026-02-23T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.229040 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 16:47:10.871928977 +0000 UTC Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.272439 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.272439 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:08:12 crc kubenswrapper[4735]: E0223 00:08:12.272746 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 00:08:12 crc kubenswrapper[4735]: E0223 00:08:12.273013 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.307366 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.307410 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.307422 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.307447 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.307463 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:12Z","lastTransitionTime":"2026-02-23T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.310253 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20a3f0f8-dc0e-4c2c-95ac-2ef9874425f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:12Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.330199 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jfk7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c2131b2-ccbc-49ff-a0bc-fd6639563dd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0d7eacf1e5180d67f73b2e794d7a0a995619d05d1019a3e152e3fa4b7c4c369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5g8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jfk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:12Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.353245 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:12Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.374293 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a35acbfb5836e5ddac27f7e140e64897f9f2ad3615e4db66b091b79744c699d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c4f2f9d3265d84be4a241b5a2a0193896ad4e9daeec1aaca6fe4fef0be21c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:12Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.391646 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zhj4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dd002d-a62b-432e-bc82-21ffdfc5e0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3765b3d0261e124aa7d128e9bdf43a6c76939b5d63f3960ab38a129bf3476c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8649q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zhj4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:12Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.410346 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.410427 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.410453 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.410486 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.410510 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:12Z","lastTransitionTime":"2026-02-23T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.411975 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zmdgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c844ff7-f40a-43c5-bf50-dae480a77428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae14a0327db69c9cd0b4cb514bee921487ce35aaeb18535d8d44bc172fcdf59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52p6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e72aa9d3da2d55c67384b2de4aaab9402eff6a464c662ca173f18affd95f4a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52p6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:08:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zmdgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:12Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.429541 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bdqfd" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b542cb9e-35cc-44d9-a850-c41887636c4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8qrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8qrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:08:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bdqfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:12Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:12 crc 
kubenswrapper[4735]: I0223 00:08:12.450884 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1153ca78-e3e2-458f-8a31-e16738c1677c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43f8fa8d2d76248580ea4800c8e0e97c060feb42b7c78fa9be3f5adbeaaf9437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a47ab0d8562bafd3080c29289e6cd496a9f76188960a3b6036cc044609876406\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd23f1d94c143d946e301f9af58c6fe189e1e95ef05c2648ad167acab65726df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb8e52070bd8c7a9784d6f81037eee627297839cf49ec88b861a6d8f3a7a16e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:12Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.477587 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://340ce9e1f14e0dc6339a25b1bd23f58d9790656cfea764bac921d60603920cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:12Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.499292 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04edd9324344faf24b9181705097f238050d69e14bf46da2f76f3d114552c942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:12Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.514172 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.514295 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.514319 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.514350 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.514368 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:12Z","lastTransitionTime":"2026-02-23T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.519600 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cba474f-2d55-4a07-969f-25e2817a06d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3ae04d2c5dac3e689b744efd8990e7210f627cf79b77aab8cdb477e22eedeef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://908fd79ca8e314a794a24b502c23d0b5abbc41b2bf079086e8fe6fc3ceaeafa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-blmnv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:12Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.543940 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26428" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29209462-90c5-4aa1-9943-d8b15ac1b5a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23e62d470b47146565ead752e6fb5578b3951831fd10126ec5ba4752e6a42e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13db4
202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13db4202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26428\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:12Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.569278 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40cc14f8-4b0d-4268-82bc-e9c2d8073cf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T00:07:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0223 00:07:35.797171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 00:07:35.799032 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411013438/tls.crt::/tmp/serving-cert-1411013438/tls.key\\\\\\\"\\\\nI0223 00:07:41.683988 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0223 00:07:41.688343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0223 00:07:41.688375 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0223 00:07:41.688410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0223 00:07:41.688422 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0223 00:07:41.698776 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0223 00:07:41.698820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698840 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0223 00:07:41.698875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0223 00:07:41.698881 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0223 00:07:41.698924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0223 00:07:41.699038 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0223 00:07:41.701822 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:12Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.586576 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:12Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.606643 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:12Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.619036 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.619103 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.619126 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 
00:08:12.619156 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.619176 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:12Z","lastTransitionTime":"2026-02-23T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.623934 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4gvxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b63c18f-b6b2-4d97-b542-7800b475bd4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33055c975ea730c1618798b3885ea706d784d23c63c80c8fbb66ffbc27318c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afb
a93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4pzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\
":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4gvxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:12Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.650555 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66853c8a-9391-4291-b5f1-c72cb5fe23e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb388ead34e3f72990c3f6cb25e388f77d137e16df82dc8ae239f146b7154f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feb388ead34e3f72990c3f6cb25e388f77d137e16df82dc8ae239f146b7154f1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T00:08:00Z\\\",\\\"message\\\":\\\" 6 for removal\\\\nI0223 00:08:00.703109 6175 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0223 00:08:00.703066 6175 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 00:08:00.703206 6175 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0223 00:08:00.703283 6175 
handler.go:208] Removed *v1.Pod event handler 6\\\\nI0223 00:08:00.703382 6175 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0223 00:08:00.703454 6175 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0223 00:08:00.703472 6175 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0223 00:08:00.703502 6175 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0223 00:08:00.703522 6175 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0223 00:08:00.703503 6175 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0223 00:08:00.703546 6175 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0223 00:08:00.703552 6175 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0223 00:08:00.703566 6175 factory.go:656] Stopping watch factory\\\\nI0223 00:08:00.703580 6175 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0223 00:08:00.703589 6175 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-59rkm_openshift-ovn-kubernetes(66853c8a-9391-4291-b5f1-c72cb5fe23e8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17
e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-59rkm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:12Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.721921 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.722023 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.722041 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.722069 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.722090 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:12Z","lastTransitionTime":"2026-02-23T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.825466 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.825535 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.825553 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.825578 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.825594 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:12Z","lastTransitionTime":"2026-02-23T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.928656 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.928721 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.928739 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.928765 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:12 crc kubenswrapper[4735]: I0223 00:08:12.928784 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:12Z","lastTransitionTime":"2026-02-23T00:08:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.033083 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.033144 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.033162 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.033186 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.033206 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:13Z","lastTransitionTime":"2026-02-23T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.137082 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.137165 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.137187 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.137210 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.137227 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:13Z","lastTransitionTime":"2026-02-23T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.230210 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 04:10:03.538491918 +0000 UTC Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.240953 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.241022 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.241042 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.241069 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.241087 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:13Z","lastTransitionTime":"2026-02-23T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.246735 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.265222 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.271967 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.271999 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:08:13 crc kubenswrapper[4735]: E0223 00:08:13.272207 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c" Feb 23 00:08:13 crc kubenswrapper[4735]: E0223 00:08:13.272382 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.272983 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1153ca78-e3e2-458f-8a31-e16738c1677c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43f8fa8d2d76248580ea4800c8e0e97c060feb42b7c78fa9be3f5adbeaaf9437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-cert
s\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a47ab0d8562bafd3080c29289e6cd496a9f76188960a3b6036cc044609876406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd23f1d94c143d946e301f9af58c6fe189e1e95ef05c2648ad167acab65726df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb8e52070bd8c7a9784d6f81037eee627297839cf49ec88b861a6d8f3a7a16e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\
\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.296632 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://340ce9e1f14e0dc6339a25b1bd23f58d9790656cfea764bac921d60603920cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.317669 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04edd9324344faf24b9181705097f238050d69e14bf46da2f76f3d114552c942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.334896 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cba474f-2d55-4a07-969f-25e2817a06d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3ae04d2c5dac3e689b744efd8990e7210f627cf79b77aab8cdb477e22eedeef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://908fd79ca8e314a794a24b502c23d0b5abbc41b2bf079086e8fe6fc3ceaeafa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-blmnv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-02-23T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.344343 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.344408 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.344427 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.344501 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.344522 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:13Z","lastTransitionTime":"2026-02-23T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.355190 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26428" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29209462-90c5-4aa1-9943-d8b15ac1b5a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23e62d470b47146565ead752e6fb5578b3951831fd10126ec5ba4752e6a42e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13db4202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13db4202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26428\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.412458 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40cc14f8-4b0d-4268-82bc-e9c2d8073cf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"re
source-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T00:07:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0223 00:07:35.797171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 00:07:35.799032 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411013438/tls.crt::/tmp/serving-cert-1411013438/tls.key\\\\\\\"\\\\nI0223 00:07:41.683988 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0223 00:07:41.688343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0223 00:07:41.688375 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0223 00:07:41.688410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0223 00:07:41.688422 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0223 00:07:41.698776 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0223 00:07:41.698820 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698840 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0223 00:07:41.698875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0223 00:07:41.698881 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0223 00:07:41.698924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0223 00:07:41.699038 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0223 00:07:41.701822 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T0
0:07:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.435911 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.447041 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.447088 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.447099 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.447117 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.447129 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:13Z","lastTransitionTime":"2026-02-23T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.451251 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.465131 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.465273 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:08:13 crc kubenswrapper[4735]: E0223 00:08:13.465314 4735 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:08:45.465287943 +0000 UTC m=+83.928833904 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.465364 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:08:13 crc kubenswrapper[4735]: E0223 00:08:13.465464 4735 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 00:08:13 crc kubenswrapper[4735]: E0223 00:08:13.465520 4735 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 00:08:13 crc kubenswrapper[4735]: E0223 00:08:13.465572 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-23 00:08:45.465546429 +0000 UTC m=+83.929092440 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 00:08:13 crc kubenswrapper[4735]: E0223 00:08:13.465603 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 00:08:45.4655908 +0000 UTC m=+83.929136811 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.467108 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4gvxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b63c18f-b6b2-4d97-b542-7800b475bd4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33055c975ea730c1618798b3885ea706d784d23c63c80c8fbb66ffbc27318c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4pzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4gvxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.492331 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66853c8a-9391-4291-b5f1-c72cb5fe23e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://feb388ead34e3f72990c3f6cb25e388f77d137e16df82dc8ae239f146b7154f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feb388ead34e3f72990c3f6cb25e388f77d137e16df82dc8ae239f146b7154f1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T00:08:00Z\\\",\\\"message\\\":\\\" 6 for removal\\\\nI0223 00:08:00.703109 6175 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0223 00:08:00.703066 6175 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 00:08:00.703206 6175 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0223 00:08:00.703283 6175 
handler.go:208] Removed *v1.Pod event handler 6\\\\nI0223 00:08:00.703382 6175 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0223 00:08:00.703454 6175 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0223 00:08:00.703472 6175 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0223 00:08:00.703502 6175 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0223 00:08:00.703522 6175 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0223 00:08:00.703503 6175 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0223 00:08:00.703546 6175 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0223 00:08:00.703552 6175 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0223 00:08:00.703566 6175 factory.go:656] Stopping watch factory\\\\nI0223 00:08:00.703580 6175 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0223 00:08:00.703589 6175 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-59rkm_openshift-ovn-kubernetes(66853c8a-9391-4291-b5f1-c72cb5fe23e8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17
e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-59rkm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.516989 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20a3f0f8-dc0e-4c2c-95ac-2ef9874425f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.531564 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jfk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c2131b2-ccbc-49ff-a0bc-fd6639563dd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0d7eacf1e5180d67f73b2e794d7a0a995619d05d1019a3e152e3fa4b7c4c369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1
b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5g8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jfk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.548251 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.549752 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.549872 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.549892 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.549951 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.549970 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:13Z","lastTransitionTime":"2026-02-23T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.562270 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a35acbfb5836e5ddac27f7e140e64897f9f2ad3615e4db66b091b79744c699d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c4f2f9d3265d84be4a241b5a2a0193896ad4e9daeec1aaca6fe4fef0be21c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.565997 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: 
\"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.566091 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:08:13 crc kubenswrapper[4735]: E0223 00:08:13.566259 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 00:08:13 crc kubenswrapper[4735]: E0223 00:08:13.566297 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 00:08:13 crc kubenswrapper[4735]: E0223 00:08:13.566305 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 00:08:13 crc kubenswrapper[4735]: E0223 00:08:13.566322 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 00:08:13 crc kubenswrapper[4735]: E0223 00:08:13.566332 4735 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 00:08:13 crc kubenswrapper[4735]: E0223 00:08:13.566342 4735 projected.go:194] Error preparing data for projected 
volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 00:08:13 crc kubenswrapper[4735]: E0223 00:08:13.566408 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-23 00:08:45.566383546 +0000 UTC m=+84.029929557 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 00:08:13 crc kubenswrapper[4735]: E0223 00:08:13.566436 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-23 00:08:45.566423407 +0000 UTC m=+84.029969418 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.580492 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zhj4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dd002d-a62b-432e-bc82-21ffdfc5e0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3765b3d0261e124aa7d128e9bdf43a6c76939b5d63f3960ab38a129bf3476c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\
\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8649q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zhj4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.595532 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zmdgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c844ff7-f40a-43c5-bf50-dae480a77428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae14a0327db69c9cd0b4cb514bee921487ce35aaeb18535d8d44bc172fcdf59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52p6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e72aa9d3da2d55c67384b2de4aaab9402eff
6a464c662ca173f18affd95f4a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52p6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:08:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zmdgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.607748 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bdqfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b542cb9e-35cc-44d9-a850-c41887636c4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8qrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8qrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:08:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bdqfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:13Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:13 crc 
kubenswrapper[4735]: I0223 00:08:13.653257 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.653309 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.653326 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.653349 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.653366 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:13Z","lastTransitionTime":"2026-02-23T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.756321 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.756385 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.756401 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.756424 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.756444 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:13Z","lastTransitionTime":"2026-02-23T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.858762 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.858815 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.858831 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.859003 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.859055 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:13Z","lastTransitionTime":"2026-02-23T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.961892 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.961956 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.961975 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.961998 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:13 crc kubenswrapper[4735]: I0223 00:08:13.962019 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:13Z","lastTransitionTime":"2026-02-23T00:08:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.064519 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.064586 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.064604 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.064629 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.064647 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:14Z","lastTransitionTime":"2026-02-23T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.167813 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.167948 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.167969 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.168000 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.168018 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:14Z","lastTransitionTime":"2026-02-23T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.230769 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 20:53:21.968714872 +0000 UTC Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.272493 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.272555 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.272493 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.272693 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.272716 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:14 crc kubenswrapper[4735]: E0223 00:08:14.272735 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 00:08:14 crc kubenswrapper[4735]: E0223 00:08:14.272892 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.275445 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.275557 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:14Z","lastTransitionTime":"2026-02-23T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.378792 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.378882 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.378903 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.378928 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.378946 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:14Z","lastTransitionTime":"2026-02-23T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.481785 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.481835 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.481886 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.481913 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.481931 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:14Z","lastTransitionTime":"2026-02-23T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.584782 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.584844 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.584902 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.584926 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.584944 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:14Z","lastTransitionTime":"2026-02-23T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.688144 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.688200 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.688217 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.688239 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.688256 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:14Z","lastTransitionTime":"2026-02-23T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.790669 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.790718 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.790737 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.790761 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.790778 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:14Z","lastTransitionTime":"2026-02-23T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.894049 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.894119 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.894136 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.894158 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.894176 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:14Z","lastTransitionTime":"2026-02-23T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.996626 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.996680 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.996697 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.996722 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:14 crc kubenswrapper[4735]: I0223 00:08:14.996741 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:14Z","lastTransitionTime":"2026-02-23T00:08:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.099455 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.099585 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.099605 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.099628 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.099644 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:15Z","lastTransitionTime":"2026-02-23T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.202278 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.202332 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.202348 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.202372 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.202390 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:15Z","lastTransitionTime":"2026-02-23T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.231728 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 13:14:17.738354562 +0000 UTC Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.271567 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.272148 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:08:15 crc kubenswrapper[4735]: E0223 00:08:15.272336 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 00:08:15 crc kubenswrapper[4735]: E0223 00:08:15.272701 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.272755 4735 scope.go:117] "RemoveContainer" containerID="feb388ead34e3f72990c3f6cb25e388f77d137e16df82dc8ae239f146b7154f1" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.311069 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.311134 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.311154 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.311178 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.311217 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:15Z","lastTransitionTime":"2026-02-23T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.414157 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.414206 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.414222 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.414244 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.414261 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:15Z","lastTransitionTime":"2026-02-23T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.516719 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.516774 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.516790 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.516814 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.516831 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:15Z","lastTransitionTime":"2026-02-23T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.619743 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.619807 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.619827 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.619884 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.619903 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:15Z","lastTransitionTime":"2026-02-23T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.688712 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-59rkm_66853c8a-9391-4291-b5f1-c72cb5fe23e8/ovnkube-controller/1.log" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.693471 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" event={"ID":"66853c8a-9391-4291-b5f1-c72cb5fe23e8","Type":"ContainerStarted","Data":"2c0ef57c9ddaf9bb3702b7cffe2357050374ab9cd1307b52e2dd91097b45c24d"} Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.694171 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.722898 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.722962 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.722981 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.723007 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.723026 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:15Z","lastTransitionTime":"2026-02-23T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.733734 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20a3f0f8-dc0e-4c2c-95ac-2ef9874425f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.755602 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44e04dd6-c3d9-417b-aa26-2f605e07fc7b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0741d00d65832abe165aed347ca8633cb72d8d3395922dfa5615527f37d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6722b48f73153052807e8d89e0ef3bbada7ac84740e56c10f8767509b55ab5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73257ceeafdfe417dd887f970b9f16ddd188d8ed38699ec89189e451698539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd35aa0556235dde90cf7c6f1750c98352ccdfc64f6d4b9acd35efed020aeb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://4fd35aa0556235dde90cf7c6f1750c98352ccdfc64f6d4b9acd35efed020aeb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.776146 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jfk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c2131b2-ccbc-49ff-a0bc-fd6639563dd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0d7eacf
1e5180d67f73b2e794d7a0a995619d05d1019a3e152e3fa4b7c4c369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5g8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jfk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.798543 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.834135 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.834179 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.834189 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.834209 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.834221 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:15Z","lastTransitionTime":"2026-02-23T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.850101 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a35acbfb5836e5ddac27f7e140e64897f9f2ad3615e4db66b091b79744c699d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c4f2f9d3265d84be4a241b5a2a0193896ad4e9daeec1aaca6fe4fef0be21c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.867617 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zhj4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dd002d-a62b-432e-bc82-21ffdfc5e0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3765b3d0261e124aa7d128e9bdf43a6c76939b5d63f3960ab38a129bf3476c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8649q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zhj4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.885542 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zmdgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c844ff7-f40a-43c5-bf50-dae480a77428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae14a0327db69c9cd0b4cb514bee921487ce35aaeb18535d8d44bc172fcdf59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52p6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e72aa9d3da2d55c67384b2de4aaab9402eff6a464c662ca173f18affd95f4a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52p6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:08:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zmdgs\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.897753 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bdqfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b542cb9e-35cc-44d9-a850-c41887636c4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8qrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8qrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:08:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bdqfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:15 crc 
kubenswrapper[4735]: I0223 00:08:15.911624 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1153ca78-e3e2-458f-8a31-e16738c1677c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43f8fa8d2d76248580ea4800c8e0e97c060feb42b7c78fa9be3f5adbeaaf9437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a47ab0d8562bafd3080c29289e6cd496a9f76188960a3b6036cc044609876406\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd23f1d94c143d946e301f9af58c6fe189e1e95ef05c2648ad167acab65726df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb8e52070bd8c7a9784d6f81037eee627297839cf49ec88b861a6d8f3a7a16e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.927920 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://340ce9e1f14e0dc6339a25b1bd23f58d9790656cfea764bac921d60603920cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.936018 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.936065 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.936076 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.936096 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.936109 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:15Z","lastTransitionTime":"2026-02-23T00:08:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.939538 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04edd9324344faf24b9181705097f238050d69e14bf46da2f76f3d114552c942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.953905 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cba474f-2d55-4a07-969f-25e2817a06d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3ae04d2c5dac3e689b744efd8990e7210f627cf79b77aab8cdb477e22eedeef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://908fd79ca8e314a794a24b502c23d0b5abbc41b2bf079086e8fe6fc3ceaeafa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-blmnv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-23T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.971183 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26428" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29209462-90c5-4aa1-9943-d8b15ac1b5a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23e62d470b47146565ead752e6fb5578b3951831fd10126ec5ba4752e6a42e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13db4202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13db4202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":
\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26428\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:15 crc kubenswrapper[4735]: I0223 00:08:15.985989 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40cc14f8-4b0d-4268-82bc-e9c2d8073cf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de25971
26bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T00:07:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0223 00:07:35.797171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 00:07:35.799032 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411013438/tls.crt::/tmp/serving-cert-1411013438/tls.key\\\\\\\"\\\\nI0223 00:07:41.683988 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0223 00:07:41.688343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0223 00:07:41.688375 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0223 00:07:41.688410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0223 00:07:41.688422 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0223 00:07:41.698776 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0223 00:07:41.698820 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698840 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0223 00:07:41.698875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0223 00:07:41.698881 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0223 00:07:41.698924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0223 00:07:41.699038 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0223 00:07:41.701822 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:15Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.002572 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:16Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.020775 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:16Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.038064 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.038132 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.038149 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 
00:08:16.038174 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.038192 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:16Z","lastTransitionTime":"2026-02-23T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.040738 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4gvxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b63c18f-b6b2-4d97-b542-7800b475bd4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33055c975ea730c1618798b3885ea706d784d23c63c80c8fbb66ffbc27318c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afb
a93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4pzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\
":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4gvxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:16Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.071179 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66853c8a-9391-4291-b5f1-c72cb5fe23e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0ef57c9ddaf9bb3702b7cffe2357050374ab9cd1307b52e2dd91097b45c24d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feb388ead34e3f72990c3f6cb25e388f77d137e16df82dc8ae239f146b7154f1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T00:08:00Z\\\",\\\"message\\\":\\\" 6 for removal\\\\nI0223 00:08:00.703109 6175 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0223 00:08:00.703066 6175 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 00:08:00.703206 6175 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0223 00:08:00.703283 6175 
handler.go:208] Removed *v1.Pod event handler 6\\\\nI0223 00:08:00.703382 6175 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0223 00:08:00.703454 6175 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0223 00:08:00.703472 6175 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0223 00:08:00.703502 6175 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0223 00:08:00.703522 6175 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0223 00:08:00.703503 6175 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0223 00:08:00.703546 6175 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0223 00:08:00.703552 6175 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0223 00:08:00.703566 6175 factory.go:656] Stopping watch factory\\\\nI0223 00:08:00.703580 6175 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0223 00:08:00.703589 6175 handler.go:208] Removed *v1.EgressIP 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-59rkm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:16Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.140495 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.140587 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.140606 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.140629 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.140648 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:16Z","lastTransitionTime":"2026-02-23T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.232722 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 09:13:00.802018854 +0000 UTC Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.243705 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.243773 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.243792 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.243817 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.243835 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:16Z","lastTransitionTime":"2026-02-23T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.272132 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.272299 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:08:16 crc kubenswrapper[4735]: E0223 00:08:16.272478 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 00:08:16 crc kubenswrapper[4735]: E0223 00:08:16.272618 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.347132 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.347213 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.347238 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.347272 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.347297 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:16Z","lastTransitionTime":"2026-02-23T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.450537 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.450601 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.450618 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.450644 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.450663 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:16Z","lastTransitionTime":"2026-02-23T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.553907 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.553969 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.553986 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.554014 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.554031 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:16Z","lastTransitionTime":"2026-02-23T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.656909 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.656984 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.657001 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.657061 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.657076 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:16Z","lastTransitionTime":"2026-02-23T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.699317 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-59rkm_66853c8a-9391-4291-b5f1-c72cb5fe23e8/ovnkube-controller/2.log" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.700396 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-59rkm_66853c8a-9391-4291-b5f1-c72cb5fe23e8/ovnkube-controller/1.log" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.703780 4735 generic.go:334] "Generic (PLEG): container finished" podID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerID="2c0ef57c9ddaf9bb3702b7cffe2357050374ab9cd1307b52e2dd91097b45c24d" exitCode=1 Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.703889 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" event={"ID":"66853c8a-9391-4291-b5f1-c72cb5fe23e8","Type":"ContainerDied","Data":"2c0ef57c9ddaf9bb3702b7cffe2357050374ab9cd1307b52e2dd91097b45c24d"} Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.703959 4735 scope.go:117] "RemoveContainer" containerID="feb388ead34e3f72990c3f6cb25e388f77d137e16df82dc8ae239f146b7154f1" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.708668 4735 scope.go:117] "RemoveContainer" containerID="2c0ef57c9ddaf9bb3702b7cffe2357050374ab9cd1307b52e2dd91097b45c24d" Feb 23 00:08:16 crc kubenswrapper[4735]: E0223 00:08:16.708984 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-59rkm_openshift-ovn-kubernetes(66853c8a-9391-4291-b5f1-c72cb5fe23e8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.733489 4735 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44e04dd6-c3d9-417b-aa26-2f605e07fc7b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0741d00d65832abe165aed347ca8633cb72d8d3395922dfa5615527f37d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6722b48f73153052807e8d89e0ef3bbada7ac84740e56c10f8767509b55ab5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a938
0066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73257ceeafdfe417dd887f970b9f16ddd188d8ed38699ec89189e451698539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd35aa0556235dde90cf7c6f1750c98352ccdfc64f6d4b9acd35efed020aeb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd35aa0556235dde90cf7c6f1750c98352ccdfc64f6d4b9acd35efed020aeb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:16Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.751731 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jfk7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c2131b2-ccbc-49ff-a0bc-fd6639563dd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0d7eacf1e5180d67f73b2e794d7a0a995619d05d1019a3e152e3fa4b7c4c369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5g8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jfk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:16Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.759820 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.759879 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.759888 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.759902 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.759913 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:16Z","lastTransitionTime":"2026-02-23T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.785692 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20a3f0f8-dc0e-4c2c-95ac-2ef9874425f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:16Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.800761 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zhj4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dd002d-a62b-432e-bc82-21ffdfc5e0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3765b3d0261e124aa7d128e9bdf43a6c76939b5d63f3960ab38a129bf3476c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8649q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zhj4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:16Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.811706 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zmdgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c844ff7-f40a-43c5-bf50-dae480a77428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae14a0327db69c9cd0b4cb514bee921487ce35aaeb18535d8d44bc172fcdf59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52p6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e72aa9d3da2d55c67384b2de4aaab9402eff6a464c662ca173f18affd95f4a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52p6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:08:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zmdgs\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:16Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.823843 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bdqfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b542cb9e-35cc-44d9-a850-c41887636c4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8qrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8qrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:08:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bdqfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:16Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:16 crc 
kubenswrapper[4735]: I0223 00:08:16.838085 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:16Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.858733 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a35acbfb5836e5ddac27f7e140e64897f9f2ad3615e4db66b091b79744c699d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c4f2f9d3265d84be4a241b5a2a0193896ad4e9daeec1aaca6fe4fef0be21c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:16Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.864353 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.864415 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.864435 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.864460 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.864480 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:16Z","lastTransitionTime":"2026-02-23T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.874495 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04edd9324344faf24b9181705097f238050d69e14bf46da2f76f3d114552c942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:16Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.887845 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cba474f-2d55-4a07-969f-25e2817a06d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3ae04d2c5dac3e689b744efd8990e7210f627cf79b77aab8cdb477e22eedeef\\\",\\\"image\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://908fd79ca8e314a794a24b502c23d0b5abbc41b2bf079086e8fe6fc3ceaeafa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-blmnv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:16Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.906636 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26428" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29209462-90c5-4aa1-9943-d8b15ac1b5a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23e62d470b47146565ead752e6fb5578b3951831fd10126ec5ba4752e6a42e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-
additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df
312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13db4202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13db4202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\
"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26428\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:16Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.923302 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1153ca78-e3e2-458f-8a31-e16738c1677c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43f8fa8d2d76248580ea4800c8e0e97c060feb42b7c78fa9be3f5adbeaaf9437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a47ab0d8562bafd3080c29289e6cd496a9f76188960a3b6036cc044609876406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd23f1d94c143d946e301f9af58c6fe189e1e95ef05c2648ad167acab65726df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb8e52070bd8c7a9784d6f81037eee627297839cf49ec88b861a6d8f3a7a16e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:16Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.935712 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://340ce9e1f14e0dc6339a25b1bd23f58d9790656cfea764bac921d60603920cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:16Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.951414 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:16Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.966816 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4gvxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b63c18f-b6b2-4d97-b542-7800b475bd4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33055c975ea730c1618798b3885ea706d784d23c63c80c8fbb66ffbc27318c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4pzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4gvxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:16Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.968084 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:16 crc 
kubenswrapper[4735]: I0223 00:08:16.968133 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.968150 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.968170 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.968183 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:16Z","lastTransitionTime":"2026-02-23T00:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:16 crc kubenswrapper[4735]: I0223 00:08:16.988749 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66853c8a-9391-4291-b5f1-c72cb5fe23e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0ef57c9ddaf9bb3702b7cffe2357050374ab9cd1307b52e2dd91097b45c24d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://feb388ead34e3f72990c3f6cb25e388f77d137e16df82dc8ae239f146b7154f1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T00:08:00Z\\\",\\\"message\\\":\\\" 6 for removal\\\\nI0223 00:08:00.703109 6175 handler.go:190] Sending *v1.Pod event 
handler 3 for removal\\\\nI0223 00:08:00.703066 6175 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0223 00:08:00.703206 6175 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0223 00:08:00.703283 6175 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0223 00:08:00.703382 6175 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0223 00:08:00.703454 6175 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0223 00:08:00.703472 6175 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0223 00:08:00.703502 6175 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0223 00:08:00.703522 6175 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0223 00:08:00.703503 6175 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0223 00:08:00.703546 6175 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0223 00:08:00.703552 6175 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0223 00:08:00.703566 6175 factory.go:656] Stopping watch factory\\\\nI0223 00:08:00.703580 6175 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0223 00:08:00.703589 6175 handler.go:208] Removed *v1.EgressIP ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c0ef57c9ddaf9bb3702b7cffe2357050374ab9cd1307b52e2dd91097b45c24d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T00:08:16Z\\\",\\\"message\\\":\\\"obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0223 00:08:16.276462 6384 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0223 00:08:16.276482 6384 ovn.go:134] 
Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0223 00:08:16.276493 6384 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0223 00:08:16.276504 6384 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0223 00:08:16.276531 6384 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-26428\\\\nF0223 00:08:16.276536 6384 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc 
annotat\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2
f3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-59rkm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:16Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.003822 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40cc14f8-4b0d-4268-82bc-e9c2d8073cf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T00:07:41Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0223 00:07:35.797171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 00:07:35.799032 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411013438/tls.crt::/tmp/serving-cert-1411013438/tls.key\\\\\\\"\\\\nI0223 00:07:41.683988 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0223 00:07:41.688343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0223 00:07:41.688375 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0223 00:07:41.688410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0223 00:07:41.688422 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0223 00:07:41.698776 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0223 00:07:41.698820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698840 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0223 00:07:41.698875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0223 00:07:41.698881 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0223 00:07:41.698924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0223 00:07:41.699038 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0223 00:07:41.701822 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51
d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:17Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.022838 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:17Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.071468 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.071523 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.071541 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:17 crc 
kubenswrapper[4735]: I0223 00:08:17.071564 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.071584 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:17Z","lastTransitionTime":"2026-02-23T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.174883 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.174960 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.174978 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.175010 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.175035 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:17Z","lastTransitionTime":"2026-02-23T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.233066 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 05:19:11.876456653 +0000 UTC Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.271797 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.271810 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:08:17 crc kubenswrapper[4735]: E0223 00:08:17.272026 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c" Feb 23 00:08:17 crc kubenswrapper[4735]: E0223 00:08:17.272147 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.278442 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.278546 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.278569 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.278598 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.278619 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:17Z","lastTransitionTime":"2026-02-23T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.382206 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.382308 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.382327 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.382351 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.382370 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:17Z","lastTransitionTime":"2026-02-23T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.484932 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.485048 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.485076 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.485117 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.485144 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:17Z","lastTransitionTime":"2026-02-23T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.588482 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.588548 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.588565 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.588591 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.588610 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:17Z","lastTransitionTime":"2026-02-23T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.691594 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.691685 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.691705 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.691740 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.691801 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:17Z","lastTransitionTime":"2026-02-23T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.711492 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-59rkm_66853c8a-9391-4291-b5f1-c72cb5fe23e8/ovnkube-controller/2.log" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.717239 4735 scope.go:117] "RemoveContainer" containerID="2c0ef57c9ddaf9bb3702b7cffe2357050374ab9cd1307b52e2dd91097b45c24d" Feb 23 00:08:17 crc kubenswrapper[4735]: E0223 00:08:17.717536 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-59rkm_openshift-ovn-kubernetes(66853c8a-9391-4291-b5f1-c72cb5fe23e8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.740119 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40cc14f8-4b0d-4268-82bc-e9c2d8073cf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T00:07:41Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0223 00:07:35.797171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 00:07:35.799032 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411013438/tls.crt::/tmp/serving-cert-1411013438/tls.key\\\\\\\"\\\\nI0223 00:07:41.683988 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0223 00:07:41.688343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0223 00:07:41.688375 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0223 00:07:41.688410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0223 00:07:41.688422 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0223 00:07:41.698776 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0223 00:07:41.698820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698840 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0223 00:07:41.698875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0223 00:07:41.698881 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0223 00:07:41.698924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0223 00:07:41.699038 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0223 00:07:41.701822 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51
d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:17Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.759883 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:17Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.779548 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:17Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.795298 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.795359 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.795376 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.795406 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.795427 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:17Z","lastTransitionTime":"2026-02-23T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.801652 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4gvxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b63c18f-b6b2-4d97-b542-7800b475bd4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33055c975ea730c1618798b3885ea706d784d23c63c80c8fbb66ffbc27318c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4pzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4gvxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:17Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.834845 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66853c8a-9391-4291-b5f1-c72cb5fe23e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0ef57c9ddaf9bb3702b7cffe2357050374ab9cd1307b52e2dd91097b45c24d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c0ef57c9ddaf9bb3702b7cffe2357050374ab9cd1307b52e2dd91097b45c24d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T00:08:16Z\\\",\\\"message\\\":\\\"obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0223 00:08:16.276462 6384 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0223 00:08:16.276482 6384 ovn.go:134] Ensuring zone local for Pod 
openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0223 00:08:16.276493 6384 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0223 00:08:16.276504 6384 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0223 00:08:16.276531 6384 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-26428\\\\nF0223 00:08:16.276536 6384 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotat\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:08:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-59rkm_openshift-ovn-kubernetes(66853c8a-9391-4291-b5f1-c72cb5fe23e8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17
e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-59rkm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:17Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.868444 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20a3f0f8-dc0e-4c2c-95ac-2ef9874425f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:17Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.889774 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44e04dd6-c3d9-417b-aa26-2f605e07fc7b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0741d00d65832abe165aed347ca8633cb72d8d3395922dfa5615527f37d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6722b48f73153052807e8d89e0ef3bbada7ac84740e56c10f8767509b55ab5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73257ceeafdfe417dd887f970b9f16ddd188d8ed38699ec89189e451698539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd35aa0556235dde90cf7c6f1750c98352ccdfc64f6d4b9acd35efed020aeb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd35aa0556235dde90cf7c6f1750c98352ccdfc64f6d4b9acd35efed020aeb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:17Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.898182 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.898278 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 
00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.898364 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.898397 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.898452 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:17Z","lastTransitionTime":"2026-02-23T00:08:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.907985 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jfk7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c2131b2-ccbc-49ff-a0bc-fd6639563dd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0d7eacf1e5180d67f73b2e794d7a0a995619d05d1019a3e152e3fa4b7c4c369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5g8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jfk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:17Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.928558 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:17Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.952251 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a35acbfb5836e5ddac27f7e140e64897f9f2ad3615e4db66b091b79744c699d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c4f2f9d3265d84be4a241b5a2a0193896ad4e9daeec1aaca6fe4fef0be21c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:17Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.969091 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zhj4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dd002d-a62b-432e-bc82-21ffdfc5e0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3765b3d0261e124aa7d128e9bdf43a6c76939b5d63f3960ab38a129bf3476c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8649q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zhj4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:17Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:17 crc kubenswrapper[4735]: I0223 00:08:17.988439 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zmdgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c844ff7-f40a-43c5-bf50-dae480a77428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae14a0327db69c9cd0b4cb514bee921487ce35aaeb18535d8d44bc172fcdf59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52p6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e72aa9d3da2d55c67384b2de4aaab9402eff6a464c662ca173f18affd95f4a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52p6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:08:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zmdgs\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:17Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.001642 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.001701 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.001726 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.001754 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.001777 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:18Z","lastTransitionTime":"2026-02-23T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.008966 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bdqfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b542cb9e-35cc-44d9-a850-c41887636c4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8qrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8qrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:08:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bdqfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:18Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:18 crc 
kubenswrapper[4735]: I0223 00:08:18.030015 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1153ca78-e3e2-458f-8a31-e16738c1677c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43f8fa8d2d76248580ea4800c8e0e97c060feb42b7c78fa9be3f5adbeaaf9437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a47ab0d8562bafd3080c29289e6cd496a9f76188960a3b6036cc044609876406\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd23f1d94c143d946e301f9af58c6fe189e1e95ef05c2648ad167acab65726df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb8e52070bd8c7a9784d6f81037eee627297839cf49ec88b861a6d8f3a7a16e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:18Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.050613 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://340ce9e1f14e0dc6339a25b1bd23f58d9790656cfea764bac921d60603920cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:18Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.071226 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04edd9324344faf24b9181705097f238050d69e14bf46da2f76f3d114552c942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:18Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.088933 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cba474f-2d55-4a07-969f-25e2817a06d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3ae04d2c5dac3e689b744efd8990e7210f627cf79b77aab8cdb477e22eedeef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://908fd79ca8e314a794a24b502c23d0b5abbc41b2bf079086e8fe6fc3ceaeafa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-blmnv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-02-23T00:08:18Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.105747 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.105807 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.105826 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.105884 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.105904 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:18Z","lastTransitionTime":"2026-02-23T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.121773 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26428" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29209462-90c5-4aa1-9943-d8b15ac1b5a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23e62d470b47146565ead752e6fb5578b3951831fd10126ec5ba4752e6a42e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13db4202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13db4202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26428\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:18Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.208890 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.208953 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.208973 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.209000 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.209018 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:18Z","lastTransitionTime":"2026-02-23T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.233590 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 01:20:55.218623098 +0000 UTC Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.271242 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.271390 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:08:18 crc kubenswrapper[4735]: E0223 00:08:18.271512 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 00:08:18 crc kubenswrapper[4735]: E0223 00:08:18.271628 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.311701 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.311765 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.311782 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.311806 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.311826 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:18Z","lastTransitionTime":"2026-02-23T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.414821 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.414919 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.414941 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.414968 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.414986 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:18Z","lastTransitionTime":"2026-02-23T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.518390 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.518437 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.518451 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.518466 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.518478 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:18Z","lastTransitionTime":"2026-02-23T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.624985 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.625033 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.625045 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.625059 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.625071 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:18Z","lastTransitionTime":"2026-02-23T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.727951 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.728341 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.728475 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.728639 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.728762 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:18Z","lastTransitionTime":"2026-02-23T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.801577 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.801675 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.801696 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.801720 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.801740 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:18Z","lastTransitionTime":"2026-02-23T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:18 crc kubenswrapper[4735]: E0223 00:08:18.824118 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc670e79-a4c0-4f94-a41e-9a217a93a98f\\\",\\\"systemUUID\\\":\\\"34aea2d1-4777-4ec2-a0dd-7ee942962cf5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:18Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.830659 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.830730 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.830749 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.830773 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.830790 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:18Z","lastTransitionTime":"2026-02-23T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:18 crc kubenswrapper[4735]: E0223 00:08:18.850955 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc670e79-a4c0-4f94-a41e-9a217a93a98f\\\",\\\"systemUUID\\\":\\\"34aea2d1-4777-4ec2-a0dd-7ee942962cf5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:18Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.856453 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.856511 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.856534 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.856566 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.856584 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:18Z","lastTransitionTime":"2026-02-23T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:18 crc kubenswrapper[4735]: E0223 00:08:18.878063 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc670e79-a4c0-4f94-a41e-9a217a93a98f\\\",\\\"systemUUID\\\":\\\"34aea2d1-4777-4ec2-a0dd-7ee942962cf5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:18Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.883390 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.883443 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.883457 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.883475 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.883489 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:18Z","lastTransitionTime":"2026-02-23T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:18 crc kubenswrapper[4735]: E0223 00:08:18.900116 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc670e79-a4c0-4f94-a41e-9a217a93a98f\\\",\\\"systemUUID\\\":\\\"34aea2d1-4777-4ec2-a0dd-7ee942962cf5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:18Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.905103 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.905299 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.905456 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.905595 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.905724 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:18Z","lastTransitionTime":"2026-02-23T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:18 crc kubenswrapper[4735]: E0223 00:08:18.926828 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc670e79-a4c0-4f94-a41e-9a217a93a98f\\\",\\\"systemUUID\\\":\\\"34aea2d1-4777-4ec2-a0dd-7ee942962cf5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:18Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:18 crc kubenswrapper[4735]: E0223 00:08:18.928049 4735 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.930082 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.930281 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.930414 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.930674 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:18 crc kubenswrapper[4735]: I0223 00:08:18.930911 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:18Z","lastTransitionTime":"2026-02-23T00:08:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.034075 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.034133 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.034149 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.034175 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.034192 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:19Z","lastTransitionTime":"2026-02-23T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.137593 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.137716 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.137734 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.137757 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.137775 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:19Z","lastTransitionTime":"2026-02-23T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.234343 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 00:22:31.335407674 +0000 UTC Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.240872 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.240920 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.240937 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.240962 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.240981 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:19Z","lastTransitionTime":"2026-02-23T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.272079 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:08:19 crc kubenswrapper[4735]: E0223 00:08:19.272275 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c" Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.272090 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:08:19 crc kubenswrapper[4735]: E0223 00:08:19.272420 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.344728 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.344784 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.344804 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.344827 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.344845 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:19Z","lastTransitionTime":"2026-02-23T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.448832 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.448913 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.448933 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.448963 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.448986 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:19Z","lastTransitionTime":"2026-02-23T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.552412 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.552464 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.552486 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.552510 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.552527 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:19Z","lastTransitionTime":"2026-02-23T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.656615 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.656683 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.656728 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.656756 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.656773 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:19Z","lastTransitionTime":"2026-02-23T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.758985 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.759063 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.759080 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.759106 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.759123 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:19Z","lastTransitionTime":"2026-02-23T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.838085 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b542cb9e-35cc-44d9-a850-c41887636c4c-metrics-certs\") pod \"network-metrics-daemon-bdqfd\" (UID: \"b542cb9e-35cc-44d9-a850-c41887636c4c\") " pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:08:19 crc kubenswrapper[4735]: E0223 00:08:19.838263 4735 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 00:08:19 crc kubenswrapper[4735]: E0223 00:08:19.838352 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b542cb9e-35cc-44d9-a850-c41887636c4c-metrics-certs podName:b542cb9e-35cc-44d9-a850-c41887636c4c nodeName:}" failed. No retries permitted until 2026-02-23 00:08:35.838322181 +0000 UTC m=+74.301868182 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b542cb9e-35cc-44d9-a850-c41887636c4c-metrics-certs") pod "network-metrics-daemon-bdqfd" (UID: "b542cb9e-35cc-44d9-a850-c41887636c4c") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.863953 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.864011 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.864029 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.864053 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.864070 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:19Z","lastTransitionTime":"2026-02-23T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.967733 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.968646 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.968823 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.969110 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:19 crc kubenswrapper[4735]: I0223 00:08:19.969286 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:19Z","lastTransitionTime":"2026-02-23T00:08:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:20 crc kubenswrapper[4735]: I0223 00:08:20.072488 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:20 crc kubenswrapper[4735]: I0223 00:08:20.072540 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:20 crc kubenswrapper[4735]: I0223 00:08:20.072558 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:20 crc kubenswrapper[4735]: I0223 00:08:20.072581 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:20 crc kubenswrapper[4735]: I0223 00:08:20.072598 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:20Z","lastTransitionTime":"2026-02-23T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:20 crc kubenswrapper[4735]: I0223 00:08:20.175483 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:20 crc kubenswrapper[4735]: I0223 00:08:20.175556 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:20 crc kubenswrapper[4735]: I0223 00:08:20.175578 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:20 crc kubenswrapper[4735]: I0223 00:08:20.175609 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:20 crc kubenswrapper[4735]: I0223 00:08:20.175630 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:20Z","lastTransitionTime":"2026-02-23T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:20 crc kubenswrapper[4735]: I0223 00:08:20.235291 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 05:46:34.180274014 +0000 UTC Feb 23 00:08:20 crc kubenswrapper[4735]: I0223 00:08:20.271382 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:08:20 crc kubenswrapper[4735]: I0223 00:08:20.271555 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:08:20 crc kubenswrapper[4735]: E0223 00:08:20.271592 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 00:08:20 crc kubenswrapper[4735]: E0223 00:08:20.272011 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 00:08:20 crc kubenswrapper[4735]: I0223 00:08:20.277894 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:20 crc kubenswrapper[4735]: I0223 00:08:20.278076 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:20 crc kubenswrapper[4735]: I0223 00:08:20.278146 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:20 crc kubenswrapper[4735]: I0223 00:08:20.278216 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:20 crc kubenswrapper[4735]: I0223 00:08:20.278285 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:20Z","lastTransitionTime":"2026-02-23T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:20 crc kubenswrapper[4735]: I0223 00:08:20.381612 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:20 crc kubenswrapper[4735]: I0223 00:08:20.381676 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:20 crc kubenswrapper[4735]: I0223 00:08:20.381694 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:20 crc kubenswrapper[4735]: I0223 00:08:20.381721 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:20 crc kubenswrapper[4735]: I0223 00:08:20.381740 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:20Z","lastTransitionTime":"2026-02-23T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:20 crc kubenswrapper[4735]: I0223 00:08:20.484652 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:20 crc kubenswrapper[4735]: I0223 00:08:20.484737 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:20 crc kubenswrapper[4735]: I0223 00:08:20.484749 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:20 crc kubenswrapper[4735]: I0223 00:08:20.484769 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:20 crc kubenswrapper[4735]: I0223 00:08:20.484783 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:20Z","lastTransitionTime":"2026-02-23T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:20 crc kubenswrapper[4735]: I0223 00:08:20.588418 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:20 crc kubenswrapper[4735]: I0223 00:08:20.588484 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:20 crc kubenswrapper[4735]: I0223 00:08:20.588502 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:20 crc kubenswrapper[4735]: I0223 00:08:20.588529 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:20 crc kubenswrapper[4735]: I0223 00:08:20.588549 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:20Z","lastTransitionTime":"2026-02-23T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:20 crc kubenswrapper[4735]: I0223 00:08:20.691484 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:20 crc kubenswrapper[4735]: I0223 00:08:20.691543 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:20 crc kubenswrapper[4735]: I0223 00:08:20.691560 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:20 crc kubenswrapper[4735]: I0223 00:08:20.691577 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:20 crc kubenswrapper[4735]: I0223 00:08:20.691589 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:20Z","lastTransitionTime":"2026-02-23T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:20 crc kubenswrapper[4735]: I0223 00:08:20.795145 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:20 crc kubenswrapper[4735]: I0223 00:08:20.795207 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:20 crc kubenswrapper[4735]: I0223 00:08:20.795248 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:20 crc kubenswrapper[4735]: I0223 00:08:20.795277 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:20 crc kubenswrapper[4735]: I0223 00:08:20.795295 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:20Z","lastTransitionTime":"2026-02-23T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:20 crc kubenswrapper[4735]: I0223 00:08:20.897803 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:20 crc kubenswrapper[4735]: I0223 00:08:20.897847 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:20 crc kubenswrapper[4735]: I0223 00:08:20.897899 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:20 crc kubenswrapper[4735]: I0223 00:08:20.897921 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:20 crc kubenswrapper[4735]: I0223 00:08:20.897939 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:20Z","lastTransitionTime":"2026-02-23T00:08:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.000652 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.000698 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.000706 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.000720 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.000729 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:21Z","lastTransitionTime":"2026-02-23T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.104101 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.104190 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.104208 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.104231 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.104246 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:21Z","lastTransitionTime":"2026-02-23T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.207426 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.207494 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.207517 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.207543 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.207568 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:21Z","lastTransitionTime":"2026-02-23T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.235811 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 07:43:43.695314469 +0000 UTC Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.271464 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.271529 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:08:21 crc kubenswrapper[4735]: E0223 00:08:21.271652 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c" Feb 23 00:08:21 crc kubenswrapper[4735]: E0223 00:08:21.271815 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.310503 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.310542 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.310550 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.310563 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.310574 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:21Z","lastTransitionTime":"2026-02-23T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.413532 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.413579 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.413590 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.413607 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.413618 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:21Z","lastTransitionTime":"2026-02-23T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.516357 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.516407 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.516419 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.516439 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.516452 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:21Z","lastTransitionTime":"2026-02-23T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.619470 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.619522 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.619533 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.619550 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.619561 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:21Z","lastTransitionTime":"2026-02-23T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.724197 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.724273 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.724290 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.724316 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.724345 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:21Z","lastTransitionTime":"2026-02-23T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.827187 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.827252 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.827277 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.827305 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.827327 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:21Z","lastTransitionTime":"2026-02-23T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.930556 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.930649 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.930670 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.930693 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:21 crc kubenswrapper[4735]: I0223 00:08:21.930710 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:21Z","lastTransitionTime":"2026-02-23T00:08:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.033087 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.033152 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.033168 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.033192 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.033210 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:22Z","lastTransitionTime":"2026-02-23T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.136316 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.136419 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.136457 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.136481 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.136499 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:22Z","lastTransitionTime":"2026-02-23T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.236345 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 11:59:08.140109841 +0000 UTC Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.239144 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.239198 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.239214 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.239239 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.239256 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:22Z","lastTransitionTime":"2026-02-23T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.271200 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:08:22 crc kubenswrapper[4735]: E0223 00:08:22.271356 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.271206 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:08:22 crc kubenswrapper[4735]: E0223 00:08:22.271474 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.291669 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.313674 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a35acbfb5836e5ddac27f7e140e64897f9f2ad3615e4db66b091b79744c699d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c4f2f9d3265d84be4a241b5a2a0193896ad4e9daeec1aaca6fe4fef0be21c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.330007 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zhj4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dd002d-a62b-432e-bc82-21ffdfc5e0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3765b3d0261e124aa7d128e9bdf43a6c76939b5d63f3960ab38a129bf3476c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8649q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zhj4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.344463 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.344545 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.344563 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.344604 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.344618 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:22Z","lastTransitionTime":"2026-02-23T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.348896 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zmdgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c844ff7-f40a-43c5-bf50-dae480a77428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae14a0327db69c9cd0b4cb514bee921487ce35aaeb18535d8d44bc172fcdf59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52p6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e72aa9d3da2d55c67384b2de4aaab9402eff6a464c662ca173f18affd95f4a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52p6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:08:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zmdgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.364130 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bdqfd" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b542cb9e-35cc-44d9-a850-c41887636c4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8qrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8qrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:08:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bdqfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:22 crc 
kubenswrapper[4735]: I0223 00:08:22.388586 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1153ca78-e3e2-458f-8a31-e16738c1677c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43f8fa8d2d76248580ea4800c8e0e97c060feb42b7c78fa9be3f5adbeaaf9437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a47ab0d8562bafd3080c29289e6cd496a9f76188960a3b6036cc044609876406\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd23f1d94c143d946e301f9af58c6fe189e1e95ef05c2648ad167acab65726df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb8e52070bd8c7a9784d6f81037eee627297839cf49ec88b861a6d8f3a7a16e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.406989 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://340ce9e1f14e0dc6339a25b1bd23f58d9790656cfea764bac921d60603920cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.428445 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04edd9324344faf24b9181705097f238050d69e14bf46da2f76f3d114552c942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.446139 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cba474f-2d55-4a07-969f-25e2817a06d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3ae04d2c5dac3e689b744efd8990e7210f627cf79b77aab8cdb477e22eedeef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://908fd79ca8e314a794a24b502c23d0b5abbc41b2bf079086e8fe6fc3ceaeafa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-blmnv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-02-23T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.448164 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.448400 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.448588 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.448776 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.449380 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:22Z","lastTransitionTime":"2026-02-23T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.466321 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26428" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29209462-90c5-4aa1-9943-d8b15ac1b5a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23e62d470b47146565ead752e6fb5578b3951831fd10126ec5ba4752e6a42e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13db4202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13db4202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26428\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.483528 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40cc14f8-4b0d-4268-82bc-e9c2d8073cf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"re
source-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T00:07:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0223 00:07:35.797171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 00:07:35.799032 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411013438/tls.crt::/tmp/serving-cert-1411013438/tls.key\\\\\\\"\\\\nI0223 00:07:41.683988 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0223 00:07:41.688343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0223 00:07:41.688375 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0223 00:07:41.688410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0223 00:07:41.688422 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0223 00:07:41.698776 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0223 00:07:41.698820 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698840 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0223 00:07:41.698875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0223 00:07:41.698881 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0223 00:07:41.698924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0223 00:07:41.699038 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0223 00:07:41.701822 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T0
0:07:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.499136 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.518845 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.534142 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4gvxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b63c18f-b6b2-4d97-b542-7800b475bd4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33055c975ea730c1618798b3885ea706d784d23c63c80c8fbb66ffbc27318c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4pzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4gvxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.552152 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:22 crc 
kubenswrapper[4735]: I0223 00:08:22.552218 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.552235 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.552260 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.552279 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:22Z","lastTransitionTime":"2026-02-23T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.559951 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66853c8a-9391-4291-b5f1-c72cb5fe23e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0ef57c9ddaf9bb3702b7cffe2357050374ab9cd1307b52e2dd91097b45c24d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c0ef57c9ddaf9bb3702b7cffe2357050374ab9cd1307b52e2dd91097b45c24d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T00:08:16Z\\\",\\\"message\\\":\\\"obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0223 00:08:16.276462 6384 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0223 00:08:16.276482 6384 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0223 00:08:16.276493 6384 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0223 00:08:16.276504 6384 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0223 00:08:16.276531 6384 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-26428\\\\nF0223 00:08:16.276536 6384 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotat\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:08:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-59rkm_openshift-ovn-kubernetes(66853c8a-9391-4291-b5f1-c72cb5fe23e8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17
e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-59rkm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.581330 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20a3f0f8-dc0e-4c2c-95ac-2ef9874425f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.596996 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44e04dd6-c3d9-417b-aa26-2f605e07fc7b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0741d00d65832abe165aed347ca8633cb72d8d3395922dfa5615527f37d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6722b48f73153052807e8d89e0ef3bbada7ac84740e56c10f8767509b55ab5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73257ceeafdfe417dd887f970b9f16ddd188d8ed38699ec89189e451698539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd35aa0556235dde90cf7c6f1750c98352ccdfc64f6d4b9acd35efed020aeb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd35aa0556235dde90cf7c6f1750c98352ccdfc64f6d4b9acd35efed020aeb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.612040 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jfk7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c2131b2-ccbc-49ff-a0bc-fd6639563dd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0d7eacf1e5180d67f73b2e794d7a0a995619d05d1019a3e152e3fa4b7c4c369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5g8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jfk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:22Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.656160 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.656239 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.656266 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.656298 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.656319 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:22Z","lastTransitionTime":"2026-02-23T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.759390 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.759435 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.759444 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.759458 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.759467 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:22Z","lastTransitionTime":"2026-02-23T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.863326 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.863893 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.864154 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.864383 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.864594 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:22Z","lastTransitionTime":"2026-02-23T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.969281 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.969583 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.969594 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.969611 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:22 crc kubenswrapper[4735]: I0223 00:08:22.969622 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:22Z","lastTransitionTime":"2026-02-23T00:08:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:23 crc kubenswrapper[4735]: I0223 00:08:23.073014 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:23 crc kubenswrapper[4735]: I0223 00:08:23.073072 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:23 crc kubenswrapper[4735]: I0223 00:08:23.073088 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:23 crc kubenswrapper[4735]: I0223 00:08:23.073112 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:23 crc kubenswrapper[4735]: I0223 00:08:23.073130 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:23Z","lastTransitionTime":"2026-02-23T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:23 crc kubenswrapper[4735]: I0223 00:08:23.175333 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:23 crc kubenswrapper[4735]: I0223 00:08:23.175376 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:23 crc kubenswrapper[4735]: I0223 00:08:23.175385 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:23 crc kubenswrapper[4735]: I0223 00:08:23.175398 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:23 crc kubenswrapper[4735]: I0223 00:08:23.175407 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:23Z","lastTransitionTime":"2026-02-23T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:23 crc kubenswrapper[4735]: I0223 00:08:23.236634 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 10:54:36.545187657 +0000 UTC Feb 23 00:08:23 crc kubenswrapper[4735]: I0223 00:08:23.271205 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:08:23 crc kubenswrapper[4735]: I0223 00:08:23.271282 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:08:23 crc kubenswrapper[4735]: E0223 00:08:23.271344 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c" Feb 23 00:08:23 crc kubenswrapper[4735]: E0223 00:08:23.271508 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 00:08:23 crc kubenswrapper[4735]: I0223 00:08:23.280568 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:23 crc kubenswrapper[4735]: I0223 00:08:23.280633 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:23 crc kubenswrapper[4735]: I0223 00:08:23.280652 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:23 crc kubenswrapper[4735]: I0223 00:08:23.280683 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:23 crc kubenswrapper[4735]: I0223 00:08:23.280702 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:23Z","lastTransitionTime":"2026-02-23T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:23 crc kubenswrapper[4735]: I0223 00:08:23.383439 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:23 crc kubenswrapper[4735]: I0223 00:08:23.383509 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:23 crc kubenswrapper[4735]: I0223 00:08:23.383531 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:23 crc kubenswrapper[4735]: I0223 00:08:23.383558 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:23 crc kubenswrapper[4735]: I0223 00:08:23.383580 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:23Z","lastTransitionTime":"2026-02-23T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:23 crc kubenswrapper[4735]: I0223 00:08:23.486703 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:23 crc kubenswrapper[4735]: I0223 00:08:23.486765 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:23 crc kubenswrapper[4735]: I0223 00:08:23.486807 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:23 crc kubenswrapper[4735]: I0223 00:08:23.486841 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:23 crc kubenswrapper[4735]: I0223 00:08:23.486914 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:23Z","lastTransitionTime":"2026-02-23T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:23 crc kubenswrapper[4735]: I0223 00:08:23.589785 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:23 crc kubenswrapper[4735]: I0223 00:08:23.589840 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:23 crc kubenswrapper[4735]: I0223 00:08:23.589898 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:23 crc kubenswrapper[4735]: I0223 00:08:23.589922 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:23 crc kubenswrapper[4735]: I0223 00:08:23.589938 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:23Z","lastTransitionTime":"2026-02-23T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:23 crc kubenswrapper[4735]: I0223 00:08:23.693466 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:23 crc kubenswrapper[4735]: I0223 00:08:23.693523 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:23 crc kubenswrapper[4735]: I0223 00:08:23.693539 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:23 crc kubenswrapper[4735]: I0223 00:08:23.693563 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:23 crc kubenswrapper[4735]: I0223 00:08:23.693579 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:23Z","lastTransitionTime":"2026-02-23T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:23 crc kubenswrapper[4735]: I0223 00:08:23.797324 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:23 crc kubenswrapper[4735]: I0223 00:08:23.797370 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:23 crc kubenswrapper[4735]: I0223 00:08:23.797382 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:23 crc kubenswrapper[4735]: I0223 00:08:23.797398 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:23 crc kubenswrapper[4735]: I0223 00:08:23.797409 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:23Z","lastTransitionTime":"2026-02-23T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:23 crc kubenswrapper[4735]: I0223 00:08:23.901007 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:23 crc kubenswrapper[4735]: I0223 00:08:23.901072 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:23 crc kubenswrapper[4735]: I0223 00:08:23.901090 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:23 crc kubenswrapper[4735]: I0223 00:08:23.901115 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:23 crc kubenswrapper[4735]: I0223 00:08:23.901134 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:23Z","lastTransitionTime":"2026-02-23T00:08:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.005243 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.005298 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.005315 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.005344 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.005361 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:24Z","lastTransitionTime":"2026-02-23T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.109105 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.109156 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.109177 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.109201 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.109219 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:24Z","lastTransitionTime":"2026-02-23T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.212581 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.212645 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.212663 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.212689 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.212710 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:24Z","lastTransitionTime":"2026-02-23T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.236949 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 10:42:05.856083097 +0000 UTC Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.272042 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:08:24 crc kubenswrapper[4735]: E0223 00:08:24.272183 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.272507 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:08:24 crc kubenswrapper[4735]: E0223 00:08:24.272610 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.315428 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.315798 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.315888 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.315962 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.316019 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:24Z","lastTransitionTime":"2026-02-23T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.418995 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.419035 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.419048 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.419067 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.419079 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:24Z","lastTransitionTime":"2026-02-23T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.522240 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.522294 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.522307 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.522327 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.522340 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:24Z","lastTransitionTime":"2026-02-23T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.624871 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.624922 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.624937 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.624956 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.624969 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:24Z","lastTransitionTime":"2026-02-23T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.727972 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.728048 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.728062 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.728084 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.728095 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:24Z","lastTransitionTime":"2026-02-23T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.831902 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.831969 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.831988 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.832014 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.832037 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:24Z","lastTransitionTime":"2026-02-23T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.941264 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.942022 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.942124 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.942222 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:24 crc kubenswrapper[4735]: I0223 00:08:24.942379 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:24Z","lastTransitionTime":"2026-02-23T00:08:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.045548 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.045620 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.045640 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.045669 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.045690 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:25Z","lastTransitionTime":"2026-02-23T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.151712 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.152455 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.152524 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.152607 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.152680 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:25Z","lastTransitionTime":"2026-02-23T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.237495 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 19:07:01.145274775 +0000 UTC Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.256346 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.256386 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.256399 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.256421 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.256434 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:25Z","lastTransitionTime":"2026-02-23T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.272060 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.272136 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:08:25 crc kubenswrapper[4735]: E0223 00:08:25.272247 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c" Feb 23 00:08:25 crc kubenswrapper[4735]: E0223 00:08:25.272332 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.360014 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.360061 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.360075 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.360096 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.360110 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:25Z","lastTransitionTime":"2026-02-23T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.463793 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.464342 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.464445 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.464544 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.464631 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:25Z","lastTransitionTime":"2026-02-23T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.568105 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.568530 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.568764 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.569027 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.569207 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:25Z","lastTransitionTime":"2026-02-23T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.672248 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.672312 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.672332 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.672357 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.672375 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:25Z","lastTransitionTime":"2026-02-23T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.774951 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.775002 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.775021 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.775049 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.775066 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:25Z","lastTransitionTime":"2026-02-23T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.877663 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.877712 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.877727 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.877744 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.877756 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:25Z","lastTransitionTime":"2026-02-23T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.980778 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.980827 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.980844 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.980899 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:25 crc kubenswrapper[4735]: I0223 00:08:25.980915 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:25Z","lastTransitionTime":"2026-02-23T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:26 crc kubenswrapper[4735]: I0223 00:08:26.083774 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:26 crc kubenswrapper[4735]: I0223 00:08:26.083880 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:26 crc kubenswrapper[4735]: I0223 00:08:26.083898 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:26 crc kubenswrapper[4735]: I0223 00:08:26.083916 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:26 crc kubenswrapper[4735]: I0223 00:08:26.083939 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:26Z","lastTransitionTime":"2026-02-23T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:26 crc kubenswrapper[4735]: I0223 00:08:26.187412 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:26 crc kubenswrapper[4735]: I0223 00:08:26.187468 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:26 crc kubenswrapper[4735]: I0223 00:08:26.187485 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:26 crc kubenswrapper[4735]: I0223 00:08:26.187508 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:26 crc kubenswrapper[4735]: I0223 00:08:26.187524 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:26Z","lastTransitionTime":"2026-02-23T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:26 crc kubenswrapper[4735]: I0223 00:08:26.238042 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 03:31:31.95127824 +0000 UTC Feb 23 00:08:26 crc kubenswrapper[4735]: I0223 00:08:26.271535 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:08:26 crc kubenswrapper[4735]: E0223 00:08:26.271657 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 00:08:26 crc kubenswrapper[4735]: I0223 00:08:26.271737 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:08:26 crc kubenswrapper[4735]: E0223 00:08:26.271991 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 00:08:26 crc kubenswrapper[4735]: I0223 00:08:26.289505 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:26 crc kubenswrapper[4735]: I0223 00:08:26.289541 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:26 crc kubenswrapper[4735]: I0223 00:08:26.289552 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:26 crc kubenswrapper[4735]: I0223 00:08:26.289563 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:26 crc kubenswrapper[4735]: I0223 00:08:26.289573 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:26Z","lastTransitionTime":"2026-02-23T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:26 crc kubenswrapper[4735]: I0223 00:08:26.391420 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:26 crc kubenswrapper[4735]: I0223 00:08:26.391459 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:26 crc kubenswrapper[4735]: I0223 00:08:26.391471 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:26 crc kubenswrapper[4735]: I0223 00:08:26.391485 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:26 crc kubenswrapper[4735]: I0223 00:08:26.391493 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:26Z","lastTransitionTime":"2026-02-23T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:26 crc kubenswrapper[4735]: I0223 00:08:26.494975 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:26 crc kubenswrapper[4735]: I0223 00:08:26.495017 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:26 crc kubenswrapper[4735]: I0223 00:08:26.495026 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:26 crc kubenswrapper[4735]: I0223 00:08:26.495042 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:26 crc kubenswrapper[4735]: I0223 00:08:26.495052 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:26Z","lastTransitionTime":"2026-02-23T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:26 crc kubenswrapper[4735]: I0223 00:08:26.598306 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:26 crc kubenswrapper[4735]: I0223 00:08:26.598692 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:26 crc kubenswrapper[4735]: I0223 00:08:26.598777 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:26 crc kubenswrapper[4735]: I0223 00:08:26.598871 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:26 crc kubenswrapper[4735]: I0223 00:08:26.598956 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:26Z","lastTransitionTime":"2026-02-23T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:26 crc kubenswrapper[4735]: I0223 00:08:26.701636 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:26 crc kubenswrapper[4735]: I0223 00:08:26.701709 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:26 crc kubenswrapper[4735]: I0223 00:08:26.701728 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:26 crc kubenswrapper[4735]: I0223 00:08:26.701754 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:26 crc kubenswrapper[4735]: I0223 00:08:26.701776 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:26Z","lastTransitionTime":"2026-02-23T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:26 crc kubenswrapper[4735]: I0223 00:08:26.804767 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:26 crc kubenswrapper[4735]: I0223 00:08:26.804839 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:26 crc kubenswrapper[4735]: I0223 00:08:26.804886 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:26 crc kubenswrapper[4735]: I0223 00:08:26.804910 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:26 crc kubenswrapper[4735]: I0223 00:08:26.804928 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:26Z","lastTransitionTime":"2026-02-23T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:26 crc kubenswrapper[4735]: I0223 00:08:26.907396 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:26 crc kubenswrapper[4735]: I0223 00:08:26.907463 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:26 crc kubenswrapper[4735]: I0223 00:08:26.907480 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:26 crc kubenswrapper[4735]: I0223 00:08:26.907504 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:26 crc kubenswrapper[4735]: I0223 00:08:26.907521 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:26Z","lastTransitionTime":"2026-02-23T00:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:27 crc kubenswrapper[4735]: I0223 00:08:27.010913 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:27 crc kubenswrapper[4735]: I0223 00:08:27.010960 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:27 crc kubenswrapper[4735]: I0223 00:08:27.010978 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:27 crc kubenswrapper[4735]: I0223 00:08:27.010997 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:27 crc kubenswrapper[4735]: I0223 00:08:27.011009 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:27Z","lastTransitionTime":"2026-02-23T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:27 crc kubenswrapper[4735]: I0223 00:08:27.113842 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:27 crc kubenswrapper[4735]: I0223 00:08:27.113930 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:27 crc kubenswrapper[4735]: I0223 00:08:27.113948 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:27 crc kubenswrapper[4735]: I0223 00:08:27.113972 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:27 crc kubenswrapper[4735]: I0223 00:08:27.113988 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:27Z","lastTransitionTime":"2026-02-23T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:27 crc kubenswrapper[4735]: I0223 00:08:27.217084 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:27 crc kubenswrapper[4735]: I0223 00:08:27.217149 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:27 crc kubenswrapper[4735]: I0223 00:08:27.217167 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:27 crc kubenswrapper[4735]: I0223 00:08:27.217191 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:27 crc kubenswrapper[4735]: I0223 00:08:27.217207 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:27Z","lastTransitionTime":"2026-02-23T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:27 crc kubenswrapper[4735]: I0223 00:08:27.239057 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 21:43:00.596407035 +0000 UTC Feb 23 00:08:27 crc kubenswrapper[4735]: I0223 00:08:27.271445 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:08:27 crc kubenswrapper[4735]: E0223 00:08:27.271553 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 00:08:27 crc kubenswrapper[4735]: I0223 00:08:27.271453 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:08:27 crc kubenswrapper[4735]: E0223 00:08:27.271760 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c" Feb 23 00:08:27 crc kubenswrapper[4735]: I0223 00:08:27.320397 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:27 crc kubenswrapper[4735]: I0223 00:08:27.320466 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:27 crc kubenswrapper[4735]: I0223 00:08:27.320486 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:27 crc kubenswrapper[4735]: I0223 00:08:27.320514 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:27 crc kubenswrapper[4735]: I0223 00:08:27.320529 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:27Z","lastTransitionTime":"2026-02-23T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:27 crc kubenswrapper[4735]: I0223 00:08:27.423486 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:27 crc kubenswrapper[4735]: I0223 00:08:27.423566 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:27 crc kubenswrapper[4735]: I0223 00:08:27.423593 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:27 crc kubenswrapper[4735]: I0223 00:08:27.423631 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:27 crc kubenswrapper[4735]: I0223 00:08:27.423655 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:27Z","lastTransitionTime":"2026-02-23T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:27 crc kubenswrapper[4735]: I0223 00:08:27.526830 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:27 crc kubenswrapper[4735]: I0223 00:08:27.526914 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:27 crc kubenswrapper[4735]: I0223 00:08:27.526926 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:27 crc kubenswrapper[4735]: I0223 00:08:27.526957 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:27 crc kubenswrapper[4735]: I0223 00:08:27.526970 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:27Z","lastTransitionTime":"2026-02-23T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:27 crc kubenswrapper[4735]: I0223 00:08:27.630429 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:27 crc kubenswrapper[4735]: I0223 00:08:27.630486 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:27 crc kubenswrapper[4735]: I0223 00:08:27.630500 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:27 crc kubenswrapper[4735]: I0223 00:08:27.630521 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:27 crc kubenswrapper[4735]: I0223 00:08:27.630534 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:27Z","lastTransitionTime":"2026-02-23T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:27 crc kubenswrapper[4735]: I0223 00:08:27.733131 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:27 crc kubenswrapper[4735]: I0223 00:08:27.733166 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:27 crc kubenswrapper[4735]: I0223 00:08:27.733178 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:27 crc kubenswrapper[4735]: I0223 00:08:27.733193 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:27 crc kubenswrapper[4735]: I0223 00:08:27.733206 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:27Z","lastTransitionTime":"2026-02-23T00:08:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:28 crc kubenswrapper[4735]: I0223 00:08:28.558492 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 22:56:00.032868572 +0000 UTC Feb 23 00:08:28 crc kubenswrapper[4735]: I0223 00:08:28.559480 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:08:28 crc kubenswrapper[4735]: I0223 00:08:28.561844 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:08:28 crc kubenswrapper[4735]: E0223 00:08:28.560448 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 00:08:28 crc kubenswrapper[4735]: E0223 00:08:28.562499 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 00:08:28 crc kubenswrapper[4735]: I0223 00:08:28.568189 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:28 crc kubenswrapper[4735]: I0223 00:08:28.568260 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:28 crc kubenswrapper[4735]: I0223 00:08:28.568279 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:28 crc kubenswrapper[4735]: I0223 00:08:28.568306 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:28 crc kubenswrapper[4735]: I0223 00:08:28.568327 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:28Z","lastTransitionTime":"2026-02-23T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:28 crc kubenswrapper[4735]: I0223 00:08:28.670628 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:28 crc kubenswrapper[4735]: I0223 00:08:28.670667 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:28 crc kubenswrapper[4735]: I0223 00:08:28.670678 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:28 crc kubenswrapper[4735]: I0223 00:08:28.670695 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:28 crc kubenswrapper[4735]: I0223 00:08:28.670731 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:28Z","lastTransitionTime":"2026-02-23T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:28 crc kubenswrapper[4735]: I0223 00:08:28.773423 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:28 crc kubenswrapper[4735]: I0223 00:08:28.773472 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:28 crc kubenswrapper[4735]: I0223 00:08:28.773484 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:28 crc kubenswrapper[4735]: I0223 00:08:28.773499 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:28 crc kubenswrapper[4735]: I0223 00:08:28.773511 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:28Z","lastTransitionTime":"2026-02-23T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:28 crc kubenswrapper[4735]: I0223 00:08:28.876032 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:28 crc kubenswrapper[4735]: I0223 00:08:28.876088 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:28 crc kubenswrapper[4735]: I0223 00:08:28.876106 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:28 crc kubenswrapper[4735]: I0223 00:08:28.876131 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:28 crc kubenswrapper[4735]: I0223 00:08:28.876152 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:28Z","lastTransitionTime":"2026-02-23T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:28 crc kubenswrapper[4735]: I0223 00:08:28.979136 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:28 crc kubenswrapper[4735]: I0223 00:08:28.979194 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:28 crc kubenswrapper[4735]: I0223 00:08:28.979202 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:28 crc kubenswrapper[4735]: I0223 00:08:28.979214 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:28 crc kubenswrapper[4735]: I0223 00:08:28.979224 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:28Z","lastTransitionTime":"2026-02-23T00:08:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.028086 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.028141 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.028160 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.028183 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.028201 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:29Z","lastTransitionTime":"2026-02-23T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:29 crc kubenswrapper[4735]: E0223 00:08:29.049523 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc670e79-a4c0-4f94-a41e-9a217a93a98f\\\",\\\"systemUUID\\\":\\\"34aea2d1-4777-4ec2-a0dd-7ee942962cf5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:29Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.054827 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.054912 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.054929 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.054953 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.054972 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:29Z","lastTransitionTime":"2026-02-23T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:29 crc kubenswrapper[4735]: E0223 00:08:29.071745 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc670e79-a4c0-4f94-a41e-9a217a93a98f\\\",\\\"systemUUID\\\":\\\"34aea2d1-4777-4ec2-a0dd-7ee942962cf5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:29Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.076115 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.076153 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.076171 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.076192 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.076208 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:29Z","lastTransitionTime":"2026-02-23T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:29 crc kubenswrapper[4735]: E0223 00:08:29.094040 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc670e79-a4c0-4f94-a41e-9a217a93a98f\\\",\\\"systemUUID\\\":\\\"34aea2d1-4777-4ec2-a0dd-7ee942962cf5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:29Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.098517 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.098567 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.098583 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.098607 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.098624 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:29Z","lastTransitionTime":"2026-02-23T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:29 crc kubenswrapper[4735]: E0223 00:08:29.115752 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc670e79-a4c0-4f94-a41e-9a217a93a98f\\\",\\\"systemUUID\\\":\\\"34aea2d1-4777-4ec2-a0dd-7ee942962cf5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:29Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.128773 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.128829 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.128873 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.128899 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.128916 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:29Z","lastTransitionTime":"2026-02-23T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:29 crc kubenswrapper[4735]: E0223 00:08:29.147800 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc670e79-a4c0-4f94-a41e-9a217a93a98f\\\",\\\"systemUUID\\\":\\\"34aea2d1-4777-4ec2-a0dd-7ee942962cf5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:29Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:29 crc kubenswrapper[4735]: E0223 00:08:29.148060 4735 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.150163 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.150208 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.150225 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.150245 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.150259 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:29Z","lastTransitionTime":"2026-02-23T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.252153 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.252234 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.252251 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.252271 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.252320 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:29Z","lastTransitionTime":"2026-02-23T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.271891 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.271940 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:08:29 crc kubenswrapper[4735]: E0223 00:08:29.272016 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c" Feb 23 00:08:29 crc kubenswrapper[4735]: E0223 00:08:29.272223 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.354389 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.354473 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.354484 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.354500 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.354510 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:29Z","lastTransitionTime":"2026-02-23T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.456418 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.456461 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.456472 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.456489 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.456499 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:29Z","lastTransitionTime":"2026-02-23T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.558835 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 19:36:54.312342989 +0000 UTC Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.559404 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.559443 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.559457 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.559472 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.559483 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:29Z","lastTransitionTime":"2026-02-23T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.662694 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.662794 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.662809 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.662832 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.662842 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:29Z","lastTransitionTime":"2026-02-23T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.766812 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.766899 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.766909 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.766925 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.766939 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:29Z","lastTransitionTime":"2026-02-23T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.869913 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.870004 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.870023 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.870049 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.870066 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:29Z","lastTransitionTime":"2026-02-23T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.973170 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.973218 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.973236 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.973270 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:29 crc kubenswrapper[4735]: I0223 00:08:29.973286 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:29Z","lastTransitionTime":"2026-02-23T00:08:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:30 crc kubenswrapper[4735]: I0223 00:08:30.076444 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:30 crc kubenswrapper[4735]: I0223 00:08:30.076489 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:30 crc kubenswrapper[4735]: I0223 00:08:30.076500 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:30 crc kubenswrapper[4735]: I0223 00:08:30.076517 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:30 crc kubenswrapper[4735]: I0223 00:08:30.076528 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:30Z","lastTransitionTime":"2026-02-23T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:30 crc kubenswrapper[4735]: I0223 00:08:30.179526 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:30 crc kubenswrapper[4735]: I0223 00:08:30.179593 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:30 crc kubenswrapper[4735]: I0223 00:08:30.179604 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:30 crc kubenswrapper[4735]: I0223 00:08:30.179621 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:30 crc kubenswrapper[4735]: I0223 00:08:30.179633 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:30Z","lastTransitionTime":"2026-02-23T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:30 crc kubenswrapper[4735]: I0223 00:08:30.273052 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:08:30 crc kubenswrapper[4735]: I0223 00:08:30.273192 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:08:30 crc kubenswrapper[4735]: E0223 00:08:30.273345 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 00:08:30 crc kubenswrapper[4735]: E0223 00:08:30.273840 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 00:08:30 crc kubenswrapper[4735]: I0223 00:08:30.281192 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:30 crc kubenswrapper[4735]: I0223 00:08:30.281229 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:30 crc kubenswrapper[4735]: I0223 00:08:30.281240 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:30 crc kubenswrapper[4735]: I0223 00:08:30.281256 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:30 crc kubenswrapper[4735]: I0223 00:08:30.281267 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:30Z","lastTransitionTime":"2026-02-23T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:30 crc kubenswrapper[4735]: I0223 00:08:30.384566 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:30 crc kubenswrapper[4735]: I0223 00:08:30.384619 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:30 crc kubenswrapper[4735]: I0223 00:08:30.384632 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:30 crc kubenswrapper[4735]: I0223 00:08:30.384649 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:30 crc kubenswrapper[4735]: I0223 00:08:30.384662 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:30Z","lastTransitionTime":"2026-02-23T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:30 crc kubenswrapper[4735]: I0223 00:08:30.487440 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:30 crc kubenswrapper[4735]: I0223 00:08:30.487510 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:30 crc kubenswrapper[4735]: I0223 00:08:30.487528 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:30 crc kubenswrapper[4735]: I0223 00:08:30.487557 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:30 crc kubenswrapper[4735]: I0223 00:08:30.487575 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:30Z","lastTransitionTime":"2026-02-23T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:30 crc kubenswrapper[4735]: I0223 00:08:30.559946 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 15:56:57.719444569 +0000 UTC Feb 23 00:08:30 crc kubenswrapper[4735]: I0223 00:08:30.589954 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:30 crc kubenswrapper[4735]: I0223 00:08:30.590021 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:30 crc kubenswrapper[4735]: I0223 00:08:30.590039 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:30 crc kubenswrapper[4735]: I0223 00:08:30.590426 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:30 crc kubenswrapper[4735]: I0223 00:08:30.590483 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:30Z","lastTransitionTime":"2026-02-23T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:30 crc kubenswrapper[4735]: I0223 00:08:30.693931 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:30 crc kubenswrapper[4735]: I0223 00:08:30.693987 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:30 crc kubenswrapper[4735]: I0223 00:08:30.694005 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:30 crc kubenswrapper[4735]: I0223 00:08:30.694027 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:30 crc kubenswrapper[4735]: I0223 00:08:30.694046 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:30Z","lastTransitionTime":"2026-02-23T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:30 crc kubenswrapper[4735]: I0223 00:08:30.796550 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:30 crc kubenswrapper[4735]: I0223 00:08:30.796661 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:30 crc kubenswrapper[4735]: I0223 00:08:30.796735 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:30 crc kubenswrapper[4735]: I0223 00:08:30.796766 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:30 crc kubenswrapper[4735]: I0223 00:08:30.796789 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:30Z","lastTransitionTime":"2026-02-23T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:30 crc kubenswrapper[4735]: I0223 00:08:30.899306 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:30 crc kubenswrapper[4735]: I0223 00:08:30.899358 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:30 crc kubenswrapper[4735]: I0223 00:08:30.899397 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:30 crc kubenswrapper[4735]: I0223 00:08:30.899425 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:30 crc kubenswrapper[4735]: I0223 00:08:30.899457 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:30Z","lastTransitionTime":"2026-02-23T00:08:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.002131 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.002190 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.002211 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.002251 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.002273 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:31Z","lastTransitionTime":"2026-02-23T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.104895 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.104974 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.104999 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.105028 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.105048 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:31Z","lastTransitionTime":"2026-02-23T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.207942 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.207992 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.208004 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.208023 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.208036 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:31Z","lastTransitionTime":"2026-02-23T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.271484 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.271528 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:08:31 crc kubenswrapper[4735]: E0223 00:08:31.272136 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c" Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.272189 4735 scope.go:117] "RemoveContainer" containerID="2c0ef57c9ddaf9bb3702b7cffe2357050374ab9cd1307b52e2dd91097b45c24d" Feb 23 00:08:31 crc kubenswrapper[4735]: E0223 00:08:31.272301 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 00:08:31 crc kubenswrapper[4735]: E0223 00:08:31.272462 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-59rkm_openshift-ovn-kubernetes(66853c8a-9391-4291-b5f1-c72cb5fe23e8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.284636 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.310887 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.310928 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.310940 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.310957 4735 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.310970 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:31Z","lastTransitionTime":"2026-02-23T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.413053 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.413120 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.413134 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.413178 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.413192 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:31Z","lastTransitionTime":"2026-02-23T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.516082 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.516122 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.516134 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.516150 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.516161 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:31Z","lastTransitionTime":"2026-02-23T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.560320 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 17:06:55.071912935 +0000 UTC Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.618574 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.618626 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.618638 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.618654 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.618666 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:31Z","lastTransitionTime":"2026-02-23T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.721345 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.721390 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.721401 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.721417 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.721426 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:31Z","lastTransitionTime":"2026-02-23T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.823791 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.823840 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.823872 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.823891 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.823903 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:31Z","lastTransitionTime":"2026-02-23T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.926240 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.926272 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.926280 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.926293 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 00:08:31 crc kubenswrapper[4735]: I0223 00:08:31.926303 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:31Z","lastTransitionTime":"2026-02-23T00:08:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.029285 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.029345 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.029363 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.029387 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.029452 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:32Z","lastTransitionTime":"2026-02-23T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.132297 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.132362 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.132383 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.132413 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.132441 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:32Z","lastTransitionTime":"2026-02-23T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.234383 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.234444 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.234462 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.234485 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.234506 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:32Z","lastTransitionTime":"2026-02-23T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.271738 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.271934 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 00:08:32 crc kubenswrapper[4735]: E0223 00:08:32.272046 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 00:08:32 crc kubenswrapper[4735]: E0223 00:08:32.272299 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.283155 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:32Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.298171 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a35acbfb5836e5ddac27f7e140e64897f9f2ad3615e4db66b091b79744c699d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c4f2f9d3265d84be4a241b5a2a0193896ad4e9daeec1aaca6fe4fef0be21c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:32Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.310544 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zhj4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dd002d-a62b-432e-bc82-21ffdfc5e0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3765b3d0261e124aa7d128e9bdf43a6c76939b5d63f3960ab38a129bf3476c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8649q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zhj4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:32Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.324413 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zmdgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c844ff7-f40a-43c5-bf50-dae480a77428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae14a0327db69c9cd0b4cb514bee921487ce35aaeb18535d8d44bc172fcdf59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52p6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e72aa9d3da2d55c67384b2de4aaab9402eff6a464c662ca173f18affd95f4a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52p6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:08:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zmdgs\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:32Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.333948 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bdqfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b542cb9e-35cc-44d9-a850-c41887636c4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8qrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8qrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:08:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bdqfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:32Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:32 crc 
kubenswrapper[4735]: I0223 00:08:32.336771 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.336827 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.336844 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.336892 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.336910 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:32Z","lastTransitionTime":"2026-02-23T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.344832 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1153ca78-e3e2-458f-8a31-e16738c1677c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43f8fa8d2d76248580ea4800c8e0e97c060feb42b7c78fa9be3f5adbeaaf9437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a47ab0d8562
bafd3080c29289e6cd496a9f76188960a3b6036cc044609876406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd23f1d94c143d946e301f9af58c6fe189e1e95ef05c2648ad167acab65726df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb8e52070bd8c7a9784d6f81037eee627297839cf49ec88b861a6d8f3a7a16e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:32Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.356754 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://340ce9e1f14e0dc6339a25b1bd23f58d9790656cfea764bac921d60603920cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:32Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.370973 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04edd9324344faf24b9181705097f238050d69e14bf46da2f76f3d114552c942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:32Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.384960 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cba474f-2d55-4a07-969f-25e2817a06d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3ae04d2c5dac3e689b744efd8990e7210f627cf79b77aab8cdb477e22eedeef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://908fd79ca8e314a794a24b502c23d0b5abbc41b2bf079086e8fe6fc3ceaeafa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-blmnv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-02-23T00:08:32Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.400684 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26428" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29209462-90c5-4aa1-9943-d8b15ac1b5a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23e62d470b47146565ead752e6fb5578b3951831fd10126ec5ba4752e6a42e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\"
:false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-relea
se\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13db4202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13db4202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-binco
py\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"
Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26428\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:32Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.418385 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40cc14f8-4b0d-4268-82bc-e9c2d8073cf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d
34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernete
s/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T00:07:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0223 00:07:35.797171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 00:07:35.799032 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411013438/tls.crt::/tmp/serving-cert-1411013438/tls.key\\\\\\\"\\\\nI0223 00:07:41.683988 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0223 00:07:41.688343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0223 00:07:41.688375 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0223 00:07:41.688410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0223 00:07:41.688422 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0223 00:07:41.698776 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0223 
00:07:41.698820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698840 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0223 00:07:41.698875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0223 00:07:41.698881 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0223 00:07:41.698924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0223 00:07:41.699038 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0223 00:07:41.701822 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:32Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.435102 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:32Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.439323 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.439394 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.439415 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.439445 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.439469 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:32Z","lastTransitionTime":"2026-02-23T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.450892 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:32Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.465089 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4gvxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b63c18f-b6b2-4d97-b542-7800b475bd4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33055c975ea730c1618798b3885ea706d784d23c63c80c8fbb66ffbc27318c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4pzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4gvxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:32Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.487370 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66853c8a-9391-4291-b5f1-c72cb5fe23e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0ef57c9ddaf9bb3702b7cffe2357050374ab9cd1307b52e2dd91097b45c24d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c0ef57c9ddaf9bb3702b7cffe2357050374ab9cd1307b52e2dd91097b45c24d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T00:08:16Z\\\",\\\"message\\\":\\\"obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0223 00:08:16.276462 6384 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0223 00:08:16.276482 6384 ovn.go:134] Ensuring zone local for Pod 
openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0223 00:08:16.276493 6384 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0223 00:08:16.276504 6384 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0223 00:08:16.276531 6384 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-26428\\\\nF0223 00:08:16.276536 6384 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotat\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:08:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-59rkm_openshift-ovn-kubernetes(66853c8a-9391-4291-b5f1-c72cb5fe23e8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17
e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-59rkm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:32Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.498801 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb9c43a6-6806-484e-aaf0-8dd620528fad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8717b726bbca97df0f82ec42ac1d3c84d9fa009f79ed5dbd4b4a0bc2ae54c896\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ab7a6b4768fded1fc53121addf4b591e68d18801561691f449f38565c6e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ab7a6b4768fded1fc53121addf4b591e68d18801561691f449f38565c6e2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:32Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.521213 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20a3f0f8-dc0e-4c2c-95ac-2ef9874425f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:32Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.534951 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44e04dd6-c3d9-417b-aa26-2f605e07fc7b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0741d00d65832abe165aed347ca8633cb72d8d3395922dfa5615527f37d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6722b48f73153052807e8d89e0ef3bbada7ac84740e56c10f8767509b55ab5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73257ceeafdfe417dd887f970b9f16ddd188d8ed38699ec89189e451698539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd35aa0556235dde90cf7c6f1750c98352ccdfc64f6d4b9acd35efed020aeb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd35aa0556235dde90cf7c6f1750c98352ccdfc64f6d4b9acd35efed020aeb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:32Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.542731 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.542780 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 
00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.542791 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.542812 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.542828 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:32Z","lastTransitionTime":"2026-02-23T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.546874 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jfk7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c2131b2-ccbc-49ff-a0bc-fd6639563dd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0d7eacf1e5180d67f73b2e794d7a0a995619d05d1019a3e152e3fa4b7c4c369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5g8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jfk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:32Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.561390 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 17:21:47.514024721 +0000 UTC Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.646293 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.646360 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.646375 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.646399 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.646413 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:32Z","lastTransitionTime":"2026-02-23T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.749385 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.749433 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.749444 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.749464 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.749477 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:32Z","lastTransitionTime":"2026-02-23T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.852229 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.852313 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.852326 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.852346 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.852358 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:32Z","lastTransitionTime":"2026-02-23T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.954758 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.954813 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.954827 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.954845 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:32 crc kubenswrapper[4735]: I0223 00:08:32.954873 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:32Z","lastTransitionTime":"2026-02-23T00:08:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.057617 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.057695 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.057714 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.057833 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.057884 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:33Z","lastTransitionTime":"2026-02-23T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.160407 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.160469 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.160486 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.160512 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.160528 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:33Z","lastTransitionTime":"2026-02-23T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.263243 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.263295 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.263308 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.263329 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.263341 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:33Z","lastTransitionTime":"2026-02-23T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.271549 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.271548 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:08:33 crc kubenswrapper[4735]: E0223 00:08:33.271692 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 00:08:33 crc kubenswrapper[4735]: E0223 00:08:33.271831 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c" Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.365100 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.365201 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.365226 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.365264 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.365291 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:33Z","lastTransitionTime":"2026-02-23T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.467597 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.467645 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.467655 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.467670 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.467679 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:33Z","lastTransitionTime":"2026-02-23T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.561607 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 14:28:32.357680738 +0000 UTC Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.570577 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.570622 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.570636 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.570653 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.570663 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:33Z","lastTransitionTime":"2026-02-23T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.673288 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.673339 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.673356 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.673381 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.673400 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:33Z","lastTransitionTime":"2026-02-23T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.776654 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.776697 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.776709 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.776726 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.776741 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:33Z","lastTransitionTime":"2026-02-23T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.879443 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.879477 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.879486 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.879501 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.879511 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:33Z","lastTransitionTime":"2026-02-23T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.981887 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.981927 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.981936 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.981951 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:33 crc kubenswrapper[4735]: I0223 00:08:33.981962 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:33Z","lastTransitionTime":"2026-02-23T00:08:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:34 crc kubenswrapper[4735]: I0223 00:08:34.084324 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:34 crc kubenswrapper[4735]: I0223 00:08:34.084382 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:34 crc kubenswrapper[4735]: I0223 00:08:34.084401 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:34 crc kubenswrapper[4735]: I0223 00:08:34.084425 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:34 crc kubenswrapper[4735]: I0223 00:08:34.084444 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:34Z","lastTransitionTime":"2026-02-23T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:34 crc kubenswrapper[4735]: I0223 00:08:34.187795 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:34 crc kubenswrapper[4735]: I0223 00:08:34.187844 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:34 crc kubenswrapper[4735]: I0223 00:08:34.187890 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:34 crc kubenswrapper[4735]: I0223 00:08:34.187915 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:34 crc kubenswrapper[4735]: I0223 00:08:34.187931 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:34Z","lastTransitionTime":"2026-02-23T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:34 crc kubenswrapper[4735]: I0223 00:08:34.271438 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:08:34 crc kubenswrapper[4735]: E0223 00:08:34.271610 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 00:08:34 crc kubenswrapper[4735]: I0223 00:08:34.271876 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:08:34 crc kubenswrapper[4735]: E0223 00:08:34.272010 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 00:08:34 crc kubenswrapper[4735]: I0223 00:08:34.290711 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:34 crc kubenswrapper[4735]: I0223 00:08:34.290915 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:34 crc kubenswrapper[4735]: I0223 00:08:34.291275 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:34 crc kubenswrapper[4735]: I0223 00:08:34.291375 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:34 crc kubenswrapper[4735]: I0223 00:08:34.291466 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:34Z","lastTransitionTime":"2026-02-23T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:34 crc kubenswrapper[4735]: I0223 00:08:34.394291 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:34 crc kubenswrapper[4735]: I0223 00:08:34.394345 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:34 crc kubenswrapper[4735]: I0223 00:08:34.394364 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:34 crc kubenswrapper[4735]: I0223 00:08:34.394388 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:34 crc kubenswrapper[4735]: I0223 00:08:34.394405 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:34Z","lastTransitionTime":"2026-02-23T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:34 crc kubenswrapper[4735]: I0223 00:08:34.497099 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:34 crc kubenswrapper[4735]: I0223 00:08:34.497161 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:34 crc kubenswrapper[4735]: I0223 00:08:34.497186 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:34 crc kubenswrapper[4735]: I0223 00:08:34.497216 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:34 crc kubenswrapper[4735]: I0223 00:08:34.497240 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:34Z","lastTransitionTime":"2026-02-23T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:34 crc kubenswrapper[4735]: I0223 00:08:34.562524 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 16:35:45.630505119 +0000 UTC Feb 23 00:08:34 crc kubenswrapper[4735]: I0223 00:08:34.599613 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:34 crc kubenswrapper[4735]: I0223 00:08:34.599669 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:34 crc kubenswrapper[4735]: I0223 00:08:34.599681 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:34 crc kubenswrapper[4735]: I0223 00:08:34.599699 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:34 crc kubenswrapper[4735]: I0223 00:08:34.599714 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:34Z","lastTransitionTime":"2026-02-23T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:34 crc kubenswrapper[4735]: I0223 00:08:34.702452 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:34 crc kubenswrapper[4735]: I0223 00:08:34.702519 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:34 crc kubenswrapper[4735]: I0223 00:08:34.702542 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:34 crc kubenswrapper[4735]: I0223 00:08:34.702570 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:34 crc kubenswrapper[4735]: I0223 00:08:34.702592 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:34Z","lastTransitionTime":"2026-02-23T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:34 crc kubenswrapper[4735]: I0223 00:08:34.804387 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:34 crc kubenswrapper[4735]: I0223 00:08:34.804444 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:34 crc kubenswrapper[4735]: I0223 00:08:34.804461 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:34 crc kubenswrapper[4735]: I0223 00:08:34.804483 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:34 crc kubenswrapper[4735]: I0223 00:08:34.804499 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:34Z","lastTransitionTime":"2026-02-23T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:34 crc kubenswrapper[4735]: I0223 00:08:34.906820 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:34 crc kubenswrapper[4735]: I0223 00:08:34.906881 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:34 crc kubenswrapper[4735]: I0223 00:08:34.906893 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:34 crc kubenswrapper[4735]: I0223 00:08:34.906915 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:34 crc kubenswrapper[4735]: I0223 00:08:34.906927 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:34Z","lastTransitionTime":"2026-02-23T00:08:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.009090 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.009147 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.009165 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.009189 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.009206 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:35Z","lastTransitionTime":"2026-02-23T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.110998 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.111056 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.111074 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.111097 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.111118 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:35Z","lastTransitionTime":"2026-02-23T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.213878 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.213935 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.213953 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.213976 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.213993 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:35Z","lastTransitionTime":"2026-02-23T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.271685 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.271693 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:08:35 crc kubenswrapper[4735]: E0223 00:08:35.271884 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 00:08:35 crc kubenswrapper[4735]: E0223 00:08:35.271979 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c" Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.316482 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.316548 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.316561 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.316578 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.316615 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:35Z","lastTransitionTime":"2026-02-23T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.419010 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.419049 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.419058 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.419072 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.419080 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:35Z","lastTransitionTime":"2026-02-23T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.520977 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.521023 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.521036 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.521052 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.521063 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:35Z","lastTransitionTime":"2026-02-23T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.563140 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 06:21:07.776278251 +0000 UTC Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.623153 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.623246 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.623294 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.623321 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.623340 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:35Z","lastTransitionTime":"2026-02-23T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.725996 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.726033 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.726045 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.726059 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.726070 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:35Z","lastTransitionTime":"2026-02-23T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.829312 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.829372 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.829390 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.829416 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.829433 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:35Z","lastTransitionTime":"2026-02-23T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.924555 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b542cb9e-35cc-44d9-a850-c41887636c4c-metrics-certs\") pod \"network-metrics-daemon-bdqfd\" (UID: \"b542cb9e-35cc-44d9-a850-c41887636c4c\") " pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:08:35 crc kubenswrapper[4735]: E0223 00:08:35.924747 4735 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 00:08:35 crc kubenswrapper[4735]: E0223 00:08:35.924896 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b542cb9e-35cc-44d9-a850-c41887636c4c-metrics-certs podName:b542cb9e-35cc-44d9-a850-c41887636c4c nodeName:}" failed. No retries permitted until 2026-02-23 00:09:07.924830871 +0000 UTC m=+106.388376882 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b542cb9e-35cc-44d9-a850-c41887636c4c-metrics-certs") pod "network-metrics-daemon-bdqfd" (UID: "b542cb9e-35cc-44d9-a850-c41887636c4c") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.932267 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.932322 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.932340 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.932366 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:35 crc kubenswrapper[4735]: I0223 00:08:35.932384 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:35Z","lastTransitionTime":"2026-02-23T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.035778 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.035842 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.035893 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.035928 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.035947 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:36Z","lastTransitionTime":"2026-02-23T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.138967 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.139022 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.139030 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.139049 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.139061 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:36Z","lastTransitionTime":"2026-02-23T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.242921 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.242998 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.243017 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.243042 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.243061 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:36Z","lastTransitionTime":"2026-02-23T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.271219 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.271301 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:08:36 crc kubenswrapper[4735]: E0223 00:08:36.271413 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 00:08:36 crc kubenswrapper[4735]: E0223 00:08:36.271571 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.346281 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.346341 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.346358 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.346384 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.346401 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:36Z","lastTransitionTime":"2026-02-23T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.449818 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.449912 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.449932 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.450338 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.450397 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:36Z","lastTransitionTime":"2026-02-23T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.552997 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.553097 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.553122 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.553685 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.554007 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:36Z","lastTransitionTime":"2026-02-23T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.564165 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 22:28:08.163801081 +0000 UTC Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.657031 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.657088 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.657104 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.657129 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.657145 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:36Z","lastTransitionTime":"2026-02-23T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.760070 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.760128 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.760147 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.760169 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.760186 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:36Z","lastTransitionTime":"2026-02-23T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.863663 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.863752 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.863770 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.863819 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.863837 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:36Z","lastTransitionTime":"2026-02-23T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.966745 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.966800 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.966816 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.966840 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:36 crc kubenswrapper[4735]: I0223 00:08:36.966891 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:36Z","lastTransitionTime":"2026-02-23T00:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.070145 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.070222 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.070247 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.070277 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.070298 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:37Z","lastTransitionTime":"2026-02-23T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.173658 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.173751 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.173767 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.173788 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.173805 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:37Z","lastTransitionTime":"2026-02-23T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.272013 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:08:37 crc kubenswrapper[4735]: E0223 00:08:37.272217 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.272050 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:08:37 crc kubenswrapper[4735]: E0223 00:08:37.272653 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.277053 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.277111 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.277129 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.277190 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.277394 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:37Z","lastTransitionTime":"2026-02-23T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.380566 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.380639 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.380662 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.380687 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.380705 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:37Z","lastTransitionTime":"2026-02-23T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.483812 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.483904 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.483923 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.483944 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.483961 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:37Z","lastTransitionTime":"2026-02-23T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.564971 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 18:50:07.088117061 +0000 UTC Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.586328 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.586390 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.586408 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.586434 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.586451 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:37Z","lastTransitionTime":"2026-02-23T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.595748 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4gvxr_5b63c18f-b6b2-4d97-b542-7800b475bd4c/kube-multus/0.log" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.596219 4735 generic.go:334] "Generic (PLEG): container finished" podID="5b63c18f-b6b2-4d97-b542-7800b475bd4c" containerID="33055c975ea730c1618798b3885ea706d784d23c63c80c8fbb66ffbc27318c38" exitCode=1 Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.596329 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4gvxr" event={"ID":"5b63c18f-b6b2-4d97-b542-7800b475bd4c","Type":"ContainerDied","Data":"33055c975ea730c1618798b3885ea706d784d23c63c80c8fbb66ffbc27318c38"} Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.597305 4735 scope.go:117] "RemoveContainer" containerID="33055c975ea730c1618798b3885ea706d784d23c63c80c8fbb66ffbc27318c38" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.613960 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb9c43a6-6806-484e-aaf0-8dd620528fad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8717b726bbca97df0f82ec42ac1d3c84d9fa009f79ed5dbd4b4a0bc2ae54c896\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ab7a6b4768fded1fc53121addf4b591e68d18801561691f449f38565c6e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ab7a6b4768fded1fc53121addf4b591e68d18801561691f449f38565c6e2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:37Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.648604 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20a3f0f8-dc0e-4c2c-95ac-2ef9874425f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:37Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.671001 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44e04dd6-c3d9-417b-aa26-2f605e07fc7b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0741d00d65832abe165aed347ca8633cb72d8d3395922dfa5615527f37d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6722b48f73153052807e8d89e0ef3bbada7ac84740e56c10f8767509b55ab5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73257ceeafdfe417dd887f970b9f16ddd188d8ed38699ec89189e451698539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd35aa0556235dde90cf7c6f1750c98352ccdfc64f6d4b9acd35efed020aeb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd35aa0556235dde90cf7c6f1750c98352ccdfc64f6d4b9acd35efed020aeb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:37Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.690211 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jfk7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c2131b2-ccbc-49ff-a0bc-fd6639563dd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0d7eacf1e5180d67f73b2e794d7a0a995619d05d1019a3e152e3fa4b7c4c369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5g8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jfk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:37Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.691261 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.691501 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.691681 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.691844 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.692061 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:37Z","lastTransitionTime":"2026-02-23T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.708339 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:37Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.733271 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a35acbfb5836e5ddac27f7e140e64897f9f2ad3615e4db66b091b79744c699d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c4f2f9d3265d84be4a241b5a2a0193896ad4e9daeec1aaca6fe4fef0be21c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:37Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.749651 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zhj4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dd002d-a62b-432e-bc82-21ffdfc5e0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3765b3d0261e124aa7d128e9bdf43a6c76939b5d63f3960ab38a129bf3476c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8649q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zhj4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:37Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.770172 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zmdgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c844ff7-f40a-43c5-bf50-dae480a77428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae14a0327db69c9cd0b4cb514bee921487ce35aaeb18535d8d44bc172fcdf59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52p6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e72aa9d3da2d55c67384b2de4aaab9402eff6a464c662ca173f18affd95f4a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52p6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:08:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zmdgs\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:37Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.782949 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bdqfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b542cb9e-35cc-44d9-a850-c41887636c4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8qrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8qrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:08:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bdqfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:37Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:37 crc 
kubenswrapper[4735]: I0223 00:08:37.795161 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.795211 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.795229 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.795253 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.795272 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:37Z","lastTransitionTime":"2026-02-23T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.808040 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1153ca78-e3e2-458f-8a31-e16738c1677c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43f8fa8d2d76248580ea4800c8e0e97c060feb42b7c78fa9be3f5adbeaaf9437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a47ab0d8562
bafd3080c29289e6cd496a9f76188960a3b6036cc044609876406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd23f1d94c143d946e301f9af58c6fe189e1e95ef05c2648ad167acab65726df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb8e52070bd8c7a9784d6f81037eee627297839cf49ec88b861a6d8f3a7a16e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:37Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.828840 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://340ce9e1f14e0dc6339a25b1bd23f58d9790656cfea764bac921d60603920cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:37Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.847041 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04edd9324344faf24b9181705097f238050d69e14bf46da2f76f3d114552c942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:37Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.863614 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cba474f-2d55-4a07-969f-25e2817a06d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3ae04d2c5dac3e689b744efd8990e7210f627cf79b77aab8cdb477e22eedeef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://908fd79ca8e314a794a24b502c23d0b5abbc41b2bf079086e8fe6fc3ceaeafa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-blmnv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-02-23T00:08:37Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.886190 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26428" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29209462-90c5-4aa1-9943-d8b15ac1b5a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23e62d470b47146565ead752e6fb5578b3951831fd10126ec5ba4752e6a42e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\"
:false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-relea
se\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13db4202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13db4202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-binco
py\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"
Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26428\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:37Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.897974 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.898056 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.898080 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.898111 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.898131 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:37Z","lastTransitionTime":"2026-02-23T00:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.907962 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40cc14f8-4b0d-4268-82bc-e9c2d8073cf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T00:07:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0223 00:07:35.797171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 00:07:35.799032 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411013438/tls.crt::/tmp/serving-cert-1411013438/tls.key\\\\\\\"\\\\nI0223 00:07:41.683988 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0223 00:07:41.688343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0223 00:07:41.688375 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0223 00:07:41.688410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0223 00:07:41.688422 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0223 00:07:41.698776 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0223 00:07:41.698820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698840 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0223 00:07:41.698875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0223 00:07:41.698881 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0223 00:07:41.698924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0223 00:07:41.699038 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0223 00:07:41.701822 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:37Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.927213 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:37Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.943949 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:37Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.965131 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4gvxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b63c18f-b6b2-4d97-b542-7800b475bd4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33055c975ea730c1618798b3885ea706d784d23c63c80c8fbb66ffbc27318c38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33055c975ea730c1618798b3885ea706d784d23c63c80c8fbb66ffbc27318c38\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T00:08:36Z\\\",\\\"message\\\":\\\"2026-02-23T00:07:51+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f4be492d-9475-498b-a2ce-b93560b9971a\\\\n2026-02-23T00:07:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f4be492d-9475-498b-a2ce-b93560b9971a to /host/opt/cni/bin/\\\\n2026-02-23T00:07:51Z [verbose] multus-daemon started\\\\n2026-02-23T00:07:51Z [verbose] Readiness Indicator file check\\\\n2026-02-23T00:08:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4pzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4gvxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:37Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:37 crc kubenswrapper[4735]: I0223 00:08:37.999006 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66853c8a-9391-4291-b5f1-c72cb5fe23e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0ef57c9ddaf9bb3702b7cffe2357050374ab9cd1307b52e2dd91097b45c24d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c0ef57c9ddaf9bb3702b7cffe2357050374ab9cd1307b52e2dd91097b45c24d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T00:08:16Z\\\",\\\"message\\\":\\\"obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0223 00:08:16.276462 6384 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0223 00:08:16.276482 6384 ovn.go:134] Ensuring zone local for Pod 
openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0223 00:08:16.276493 6384 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0223 00:08:16.276504 6384 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0223 00:08:16.276531 6384 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-26428\\\\nF0223 00:08:16.276536 6384 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotat\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:08:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-59rkm_openshift-ovn-kubernetes(66853c8a-9391-4291-b5f1-c72cb5fe23e8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17
e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-59rkm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:37Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.000704 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.000810 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.000832 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.000893 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.000916 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:38Z","lastTransitionTime":"2026-02-23T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.103435 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.103483 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.103495 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.103513 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.103525 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:38Z","lastTransitionTime":"2026-02-23T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.206400 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.206477 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.206495 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.206519 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.206539 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:38Z","lastTransitionTime":"2026-02-23T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.271445 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.271523 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:08:38 crc kubenswrapper[4735]: E0223 00:08:38.271639 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 00:08:38 crc kubenswrapper[4735]: E0223 00:08:38.271745 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.309899 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.310213 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.310320 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.310421 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.310519 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:38Z","lastTransitionTime":"2026-02-23T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.413659 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.413735 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.413758 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.413788 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.413811 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:38Z","lastTransitionTime":"2026-02-23T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.517083 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.517159 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.517184 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.517218 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.517243 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:38Z","lastTransitionTime":"2026-02-23T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.565342 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 12:57:18.269095239 +0000 UTC Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.603440 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4gvxr_5b63c18f-b6b2-4d97-b542-7800b475bd4c/kube-multus/0.log" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.603523 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4gvxr" event={"ID":"5b63c18f-b6b2-4d97-b542-7800b475bd4c","Type":"ContainerStarted","Data":"5cd4fa9902eb9e1566216eb36f85b29b30e6a4a3f687b029c77356de5620c5e0"} Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.620372 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.620422 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.620438 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.620460 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.620476 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:38Z","lastTransitionTime":"2026-02-23T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.622117 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cba474f-2d55-4a07-969f-25e2817a06d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3ae04d2c5dac3e689b744efd8990e7210f627cf79b77aab8cdb477e22eedeef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://908fd79ca8e314a794a24b502c23d0b5abbc41b2bf079086e8fe6fc3ceaeafa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-blmnv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:38Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.646881 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26428" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29209462-90c5-4aa1-9943-d8b15ac1b5a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23e62d470b47146565ead752e6fb5578b3951831fd10126ec5ba4752e6a42e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13db4
202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13db4202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26428\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:38Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.667189 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1153ca78-e3e2-458f-8a31-e16738c1677c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43f8fa8d2d76248580ea4800c8e0e97c060feb42b7c78fa9be3f5adbeaaf9437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a47ab0d8562bafd3080c29289e6cd496a9f76188960a3b6036cc044609876406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd23f1d94c143d946e301f9af58c6fe189e1e95ef05c2648ad167acab65726df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb8e52070bd8c7a9784d6f81037eee627297839cf49ec88b861a6d8f3a7a16e\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:38Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.689778 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://340ce9e1f14e0dc6339a25b1bd23f58d9790656cfea764bac921d60603920cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:38Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.707644 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04edd9324344faf24b9181705097f238050d69e14bf46da2f76f3d114552c942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:38Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.722968 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.723017 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.723036 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.723056 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.723072 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:38Z","lastTransitionTime":"2026-02-23T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.729929 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4gvxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b63c18f-b6b2-4d97-b542-7800b475bd4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cd4fa9902eb9e1566216eb36f85b29b30e6a4a3f687b029c77356de5620c5e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33055c975ea730c1618798b3885ea706d784d23c63c80c8fbb66ffbc27318c38\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T00:08:36Z\\\",\\\"message\\\":\\\"2026-02-23T00:07:51+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f4be492d-9475-498b-a2ce-b93560b9971a\\\\n2026-02-23T00:07:51+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f4be492d-9475-498b-a2ce-b93560b9971a to /host/opt/cni/bin/\\\\n2026-02-23T00:07:51Z [verbose] multus-daemon started\\\\n2026-02-23T00:07:51Z [verbose] Readiness Indicator file check\\\\n2026-02-23T00:08:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4pzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4gvxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:38Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.755674 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66853c8a-9391-4291-b5f1-c72cb5fe23e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0ef57c9ddaf9bb3702b7cffe2357050374ab9cd1307b52e2dd91097b45c24d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c0ef57c9ddaf9bb3702b7cffe2357050374ab9cd1307b52e2dd91097b45c24d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T00:08:16Z\\\",\\\"message\\\":\\\"obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0223 00:08:16.276462 6384 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0223 00:08:16.276482 6384 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0223 00:08:16.276493 6384 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0223 00:08:16.276504 6384 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0223 00:08:16.276531 6384 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-26428\\\\nF0223 00:08:16.276536 6384 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotat\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:08:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-59rkm_openshift-ovn-kubernetes(66853c8a-9391-4291-b5f1-c72cb5fe23e8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17
e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-59rkm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:38Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.773606 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40cc14f8-4b0d-4268-82bc-e9c2d8073cf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T00:07:41Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0223 00:07:35.797171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 00:07:35.799032 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411013438/tls.crt::/tmp/serving-cert-1411013438/tls.key\\\\\\\"\\\\nI0223 00:07:41.683988 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0223 00:07:41.688343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0223 00:07:41.688375 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0223 00:07:41.688410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0223 00:07:41.688422 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0223 00:07:41.698776 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0223 00:07:41.698820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698840 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0223 00:07:41.698875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0223 00:07:41.698881 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0223 00:07:41.698924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0223 00:07:41.699038 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0223 00:07:41.701822 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51
d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:38Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.788720 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:38Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.807102 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:38Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.823926 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jfk7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c2131b2-ccbc-49ff-a0bc-fd6639563dd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0d7eacf1e5180d67f73b2e794d7a0a995619d05d1019a3e152e3fa4b7c4c369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5g8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jfk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:38Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.825740 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.826021 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.826222 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.826412 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.826643 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:38Z","lastTransitionTime":"2026-02-23T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.838228 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb9c43a6-6806-484e-aaf0-8dd620528fad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8717b726bbca97df0f82ec42ac1d3c84d9fa009f79ed5dbd4b4a0bc2ae54c896\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ab7a6b4768fded1fc53121addf4b591e68d18801561691f449f38565c6e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ab7a6b4768fded1fc53121addf4b591e68d18801561691f449f38565c6e2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:38Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.863936 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20a3f0f8-dc0e-4c2c-95ac-2ef9874425f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:38Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.877274 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44e04dd6-c3d9-417b-aa26-2f605e07fc7b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0741d00d65832abe165aed347ca8633cb72d8d3395922dfa5615527f37d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6722b48f73153052807e8d89e0ef3bbada7ac84740e56c10f8767509b55ab5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73257ceeafdfe417dd887f970b9f16ddd188d8ed38699ec89189e451698539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd35aa0556235dde90cf7c6f1750c98352ccdfc64f6d4b9acd35efed020aeb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd35aa0556235dde90cf7c6f1750c98352ccdfc64f6d4b9acd35efed020aeb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:38Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.892143 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zmdgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c844ff7-f40a-43c5-bf50-dae480a77428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae14a0327db69c9cd0b4cb514bee921487ce35aaeb18535d8d44bc172fcdf59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52p6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e72aa9d3da2d55c67384b2de4aaab9402eff
6a464c662ca173f18affd95f4a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52p6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:08:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zmdgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:38Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.901462 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bdqfd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b542cb9e-35cc-44d9-a850-c41887636c4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8qrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8qrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:08:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bdqfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:38Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:38 crc 
kubenswrapper[4735]: I0223 00:08:38.912913 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:38Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.929234 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.929266 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.929278 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.929293 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.929304 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:38Z","lastTransitionTime":"2026-02-23T00:08:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.930640 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a35acbfb5836e5ddac27f7e140e64897f9f2ad3615e4db66b091b79744c699d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c4f2f9d3265d84be4a241b5a2a0193896ad4e9daeec1aaca6fe4fef0be21c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:38Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:38 crc kubenswrapper[4735]: I0223 00:08:38.941648 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zhj4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dd002d-a62b-432e-bc82-21ffdfc5e0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3765b3d0261e124aa7d128e9bdf43a6c76939b5d63f3960ab38a129bf3476c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8649q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zhj4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:38Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.032526 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.032587 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.032609 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.032641 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.032663 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:39Z","lastTransitionTime":"2026-02-23T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.136755 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.136806 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.136823 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.136874 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.136897 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:39Z","lastTransitionTime":"2026-02-23T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.222880 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.222937 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.222954 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.222980 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.222998 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:39Z","lastTransitionTime":"2026-02-23T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:39 crc kubenswrapper[4735]: E0223 00:08:39.243960 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc670e79-a4c0-4f94-a41e-9a217a93a98f\\\",\\\"systemUUID\\\":\\\"34aea2d1-4777-4ec2-a0dd-7ee942962cf5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:39Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.249232 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.249280 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.249299 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.249324 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.249341 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:39Z","lastTransitionTime":"2026-02-23T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:39 crc kubenswrapper[4735]: E0223 00:08:39.267363 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc670e79-a4c0-4f94-a41e-9a217a93a98f\\\",\\\"systemUUID\\\":\\\"34aea2d1-4777-4ec2-a0dd-7ee942962cf5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:39Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.271327 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:08:39 crc kubenswrapper[4735]: E0223 00:08:39.271477 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.271522 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:08:39 crc kubenswrapper[4735]: E0223 00:08:39.272004 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.274299 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.274356 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.274374 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.274402 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.274419 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:39Z","lastTransitionTime":"2026-02-23T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:39 crc kubenswrapper[4735]: E0223 00:08:39.297256 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc670e79-a4c0-4f94-a41e-9a217a93a98f\\\",\\\"systemUUID\\\":\\\"34aea2d1-4777-4ec2-a0dd-7ee942962cf5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:39Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.303234 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.303289 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.303306 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.303327 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.303344 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:39Z","lastTransitionTime":"2026-02-23T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:39 crc kubenswrapper[4735]: E0223 00:08:39.323379 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc670e79-a4c0-4f94-a41e-9a217a93a98f\\\",\\\"systemUUID\\\":\\\"34aea2d1-4777-4ec2-a0dd-7ee942962cf5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:39Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.327900 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.327953 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.327965 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.327983 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.327994 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:39Z","lastTransitionTime":"2026-02-23T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:39 crc kubenswrapper[4735]: E0223 00:08:39.346216 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc670e79-a4c0-4f94-a41e-9a217a93a98f\\\",\\\"systemUUID\\\":\\\"34aea2d1-4777-4ec2-a0dd-7ee942962cf5\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:39Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:39 crc kubenswrapper[4735]: E0223 00:08:39.346331 4735 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.348051 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.348075 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.348084 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.348097 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.348106 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:39Z","lastTransitionTime":"2026-02-23T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.450988 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.451048 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.451065 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.451089 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.451106 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:39Z","lastTransitionTime":"2026-02-23T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.554265 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.554311 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.554322 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.554340 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.554351 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:39Z","lastTransitionTime":"2026-02-23T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.565754 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 05:08:45.79817342 +0000 UTC Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.656807 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.657140 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.657226 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.657350 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.657443 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:39Z","lastTransitionTime":"2026-02-23T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.760542 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.760591 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.760606 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.760623 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.760635 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:39Z","lastTransitionTime":"2026-02-23T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.862613 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.862663 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.862674 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.862691 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.862704 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:39Z","lastTransitionTime":"2026-02-23T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.965027 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.965822 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.966040 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.966223 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:39 crc kubenswrapper[4735]: I0223 00:08:39.966437 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:39Z","lastTransitionTime":"2026-02-23T00:08:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.068665 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.068697 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.068721 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.068733 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.068743 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:40Z","lastTransitionTime":"2026-02-23T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.170540 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.170583 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.170598 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.170618 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.170632 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:40Z","lastTransitionTime":"2026-02-23T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.271180 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:08:40 crc kubenswrapper[4735]: E0223 00:08:40.271314 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.271178 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:08:40 crc kubenswrapper[4735]: E0223 00:08:40.271393 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.272424 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.272473 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.272489 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.272505 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.272519 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:40Z","lastTransitionTime":"2026-02-23T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.374900 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.374966 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.374983 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.375008 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.375025 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:40Z","lastTransitionTime":"2026-02-23T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.477183 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.477215 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.477224 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.477238 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.477247 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:40Z","lastTransitionTime":"2026-02-23T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.566365 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 22:39:29.827954003 +0000 UTC Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.580029 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.580076 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.580098 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.580128 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.580153 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:40Z","lastTransitionTime":"2026-02-23T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.683302 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.683363 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.683381 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.683408 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.683426 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:40Z","lastTransitionTime":"2026-02-23T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.786341 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.786409 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.786426 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.786450 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.786469 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:40Z","lastTransitionTime":"2026-02-23T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.889627 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.889690 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.889711 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.889743 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.889766 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:40Z","lastTransitionTime":"2026-02-23T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.992500 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.992556 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.992568 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.992586 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:40 crc kubenswrapper[4735]: I0223 00:08:40.992596 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:40Z","lastTransitionTime":"2026-02-23T00:08:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:41 crc kubenswrapper[4735]: I0223 00:08:41.096147 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:41 crc kubenswrapper[4735]: I0223 00:08:41.096200 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:41 crc kubenswrapper[4735]: I0223 00:08:41.096218 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:41 crc kubenswrapper[4735]: I0223 00:08:41.096241 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:41 crc kubenswrapper[4735]: I0223 00:08:41.096257 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:41Z","lastTransitionTime":"2026-02-23T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:41 crc kubenswrapper[4735]: I0223 00:08:41.199540 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:41 crc kubenswrapper[4735]: I0223 00:08:41.199626 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:41 crc kubenswrapper[4735]: I0223 00:08:41.199651 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:41 crc kubenswrapper[4735]: I0223 00:08:41.199681 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:41 crc kubenswrapper[4735]: I0223 00:08:41.199706 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:41Z","lastTransitionTime":"2026-02-23T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:41 crc kubenswrapper[4735]: I0223 00:08:41.271974 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:08:41 crc kubenswrapper[4735]: I0223 00:08:41.272087 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:08:41 crc kubenswrapper[4735]: E0223 00:08:41.272184 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c" Feb 23 00:08:41 crc kubenswrapper[4735]: E0223 00:08:41.272331 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 00:08:41 crc kubenswrapper[4735]: I0223 00:08:41.302444 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:41 crc kubenswrapper[4735]: I0223 00:08:41.302491 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:41 crc kubenswrapper[4735]: I0223 00:08:41.302509 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:41 crc kubenswrapper[4735]: I0223 00:08:41.302533 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:41 crc kubenswrapper[4735]: I0223 00:08:41.302551 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:41Z","lastTransitionTime":"2026-02-23T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:41 crc kubenswrapper[4735]: I0223 00:08:41.405448 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:41 crc kubenswrapper[4735]: I0223 00:08:41.405489 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:41 crc kubenswrapper[4735]: I0223 00:08:41.405505 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:41 crc kubenswrapper[4735]: I0223 00:08:41.405525 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:41 crc kubenswrapper[4735]: I0223 00:08:41.405540 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:41Z","lastTransitionTime":"2026-02-23T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:41 crc kubenswrapper[4735]: I0223 00:08:41.508882 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:41 crc kubenswrapper[4735]: I0223 00:08:41.508955 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:41 crc kubenswrapper[4735]: I0223 00:08:41.508977 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:41 crc kubenswrapper[4735]: I0223 00:08:41.509009 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:41 crc kubenswrapper[4735]: I0223 00:08:41.509032 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:41Z","lastTransitionTime":"2026-02-23T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:41 crc kubenswrapper[4735]: I0223 00:08:41.567144 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 13:19:56.30964452 +0000 UTC Feb 23 00:08:41 crc kubenswrapper[4735]: I0223 00:08:41.612315 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:41 crc kubenswrapper[4735]: I0223 00:08:41.612410 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:41 crc kubenswrapper[4735]: I0223 00:08:41.612433 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:41 crc kubenswrapper[4735]: I0223 00:08:41.612459 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:41 crc kubenswrapper[4735]: I0223 00:08:41.612480 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:41Z","lastTransitionTime":"2026-02-23T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:41 crc kubenswrapper[4735]: I0223 00:08:41.715596 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:41 crc kubenswrapper[4735]: I0223 00:08:41.715661 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:41 crc kubenswrapper[4735]: I0223 00:08:41.715679 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:41 crc kubenswrapper[4735]: I0223 00:08:41.715703 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:41 crc kubenswrapper[4735]: I0223 00:08:41.715719 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:41Z","lastTransitionTime":"2026-02-23T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:41 crc kubenswrapper[4735]: I0223 00:08:41.818681 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:41 crc kubenswrapper[4735]: I0223 00:08:41.818743 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:41 crc kubenswrapper[4735]: I0223 00:08:41.818760 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:41 crc kubenswrapper[4735]: I0223 00:08:41.818787 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:41 crc kubenswrapper[4735]: I0223 00:08:41.818805 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:41Z","lastTransitionTime":"2026-02-23T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:41 crc kubenswrapper[4735]: I0223 00:08:41.921673 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:41 crc kubenswrapper[4735]: I0223 00:08:41.921737 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:41 crc kubenswrapper[4735]: I0223 00:08:41.921755 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:41 crc kubenswrapper[4735]: I0223 00:08:41.921780 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:41 crc kubenswrapper[4735]: I0223 00:08:41.921798 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:41Z","lastTransitionTime":"2026-02-23T00:08:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.025079 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.025128 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.025145 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.025174 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.025192 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:42Z","lastTransitionTime":"2026-02-23T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.128122 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.128192 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.128216 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.128242 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.128259 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:42Z","lastTransitionTime":"2026-02-23T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.231203 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.231279 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.231303 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.231333 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.231357 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:42Z","lastTransitionTime":"2026-02-23T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.271262 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.271335 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:08:42 crc kubenswrapper[4735]: E0223 00:08:42.271423 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 00:08:42 crc kubenswrapper[4735]: E0223 00:08:42.271521 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.295077 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:42Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.313838 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4gvxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b63c18f-b6b2-4d97-b542-7800b475bd4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cd4fa9902eb9e1566216eb36f85b29b30e6a4a3f687b029c77356de5620c5e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33055c975ea730c1618798b3885ea706d784d23c63c80c8fbb66ffbc27318c38\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T00:08:36Z\\\",\\\"message\\\":\\\"2026-02-23T00:07:51+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f4be492d-9475-498b-a2ce-b93560b9971a\\\\n2026-02-23T00:07:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f4be492d-9475-498b-a2ce-b93560b9971a to /host/opt/cni/bin/\\\\n2026-02-23T00:07:51Z [verbose] multus-daemon started\\\\n2026-02-23T00:07:51Z [verbose] 
Readiness Indicator file check\\\\n2026-02-23T00:08:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4pzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4gvxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:42Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.334538 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.334595 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.334616 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.334640 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.334659 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:42Z","lastTransitionTime":"2026-02-23T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.339574 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66853c8a-9391-4291-b5f1-c72cb5fe23e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c0ef57c9ddaf9bb3702b7cffe2357050374ab9cd1307b52e2dd91097b45c24d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c0ef57c9ddaf9bb3702b7cffe2357050374ab9cd1307b52e2dd91097b45c24d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T00:08:16Z\\\",\\\"message\\\":\\\"obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0223 00:08:16.276462 6384 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0223 00:08:16.276482 6384 ovn.go:134] Ensuring zone local for Pod 
openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0223 00:08:16.276493 6384 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0223 00:08:16.276504 6384 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0223 00:08:16.276531 6384 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-26428\\\\nF0223 00:08:16.276536 6384 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotat\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:08:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-59rkm_openshift-ovn-kubernetes(66853c8a-9391-4291-b5f1-c72cb5fe23e8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17
e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-59rkm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:42Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.357657 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40cc14f8-4b0d-4268-82bc-e9c2d8073cf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T00:07:41Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0223 00:07:35.797171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 00:07:35.799032 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411013438/tls.crt::/tmp/serving-cert-1411013438/tls.key\\\\\\\"\\\\nI0223 00:07:41.683988 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0223 00:07:41.688343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0223 00:07:41.688375 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0223 00:07:41.688410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0223 00:07:41.688422 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0223 00:07:41.698776 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0223 00:07:41.698820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698840 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0223 00:07:41.698875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0223 00:07:41.698881 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0223 00:07:41.698924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0223 00:07:41.699038 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0223 00:07:41.701822 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51
d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:42Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.374130 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:42Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.389982 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44e04dd6-c3d9-417b-aa26-2f605e07fc7b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0741d00d65832abe165aed347ca8633cb72d8d3395922dfa5615527f37d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6722b48f73153052807e8d89e0ef3bbada7ac84740e56c10f8767509b55ab5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73257ceeafdfe417dd887f970b9f16ddd188d8ed38699ec89189e451698539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd35aa0556235dde90cf7c6f1750c98352ccdfc64f6d4b9acd35efed020aeb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://4fd35aa0556235dde90cf7c6f1750c98352ccdfc64f6d4b9acd35efed020aeb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:42Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.403305 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jfk7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c2131b2-ccbc-49ff-a0bc-fd6639563dd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0d7eacf
1e5180d67f73b2e794d7a0a995619d05d1019a3e152e3fa4b7c4c369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5g8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jfk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:42Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.416674 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb9c43a6-6806-484e-aaf0-8dd620528fad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8717b726bbca97df0f82ec42ac1d3c84d9fa009f79ed5dbd4b4a0bc2ae54c896\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ab7a6b4768fded1fc53121addf4b591e68d18801561691f449f38565c6e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ab7a6b4768fded1fc53121addf4b591e68d18801561691f449f38565c6e2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:42Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.437638 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.437700 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.437715 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.437732 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.437745 4735 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:42Z","lastTransitionTime":"2026-02-23T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.449006 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20a3f0f8-dc0e-4c2c-95ac-2ef9874425f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2
039ae65385872daabc7afbae10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:42Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.463656 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zhj4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dd002d-a62b-432e-bc82-21ffdfc5e0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3765b3d0261e124aa7d128e9bdf43a6c76939b5d63f3960ab38a129bf3476c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8649q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zhj4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:42Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.478813 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zmdgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c844ff7-f40a-43c5-bf50-dae480a77428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae14a0327db69c9cd0b4cb514bee921487ce35aaeb18535d8d44bc172fcdf59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52p6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e72aa9d3da2d55c67384b2de4aaab9402eff6a464c662ca173f18affd95f4a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52p6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:08:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zmdgs\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:42Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.492511 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bdqfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b542cb9e-35cc-44d9-a850-c41887636c4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8qrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8qrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:08:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bdqfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:42Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:42 crc 
kubenswrapper[4735]: I0223 00:08:42.509446 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:42Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.522722 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a35acbfb5836e5ddac27f7e140e64897f9f2ad3615e4db66b091b79744c699d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c4f2f9d3265d84be4a241b5a2a0193896ad4e9daeec1aaca6fe4fef0be21c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:42Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.546309 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.546706 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.546832 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.547176 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.547320 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:42Z","lastTransitionTime":"2026-02-23T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.572143 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 16:31:30.944360785 +0000 UTC Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.572678 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04edd9324344faf24b9181705097f238050d69e14bf46da2f76f3d114552c942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:42Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.593364 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cba474f-2d55-4a07-969f-25e2817a06d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3ae04d2c5dac3e689b744efd8990e7210f627cf79b77aab8cdb477e22eedeef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://908fd79ca8e314a794a24b502c23d0b5abbc41b2
bf079086e8fe6fc3ceaeafa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-blmnv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:42Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.617187 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26428" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29209462-90c5-4aa1-9943-d8b15ac1b5a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23e62d470b47146565ead752e6fb5578b3951831fd10126ec5ba4752e6a42e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13db4
202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13db4202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26428\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:42Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.635799 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1153ca78-e3e2-458f-8a31-e16738c1677c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43f8fa8d2d76248580ea4800c8e0e97c060feb42b7c78fa9be3f5adbeaaf9437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a47ab0d8562bafd3080c29289e6cd496a9f76188960a3b6036cc044609876406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd23f1d94c143d946e301f9af58c6fe189e1e95ef05c2648ad167acab65726df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb8e52070bd8c7a9784d6f81037eee627297839cf49ec88b861a6d8f3a7a16e\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:42Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.651324 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.651653 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.651901 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.652111 
4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.652299 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:42Z","lastTransitionTime":"2026-02-23T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.658544 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://340ce9e1f14e0dc6339a25b1bd23f58d9790656cfea764bac921d60603920cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:42Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.754711 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.754770 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.754789 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.754812 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.754834 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:42Z","lastTransitionTime":"2026-02-23T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.857616 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.857695 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.857718 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.857750 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.857771 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:42Z","lastTransitionTime":"2026-02-23T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.960598 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.960672 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.960690 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.960718 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:42 crc kubenswrapper[4735]: I0223 00:08:42.960736 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:42Z","lastTransitionTime":"2026-02-23T00:08:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.064843 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.064951 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.064969 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.064995 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.065016 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:43Z","lastTransitionTime":"2026-02-23T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.168175 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.168261 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.168284 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.168315 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.168339 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:43Z","lastTransitionTime":"2026-02-23T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.271146 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.271203 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.271502 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.271557 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.271584 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.271613 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.271635 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:43Z","lastTransitionTime":"2026-02-23T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:43 crc kubenswrapper[4735]: E0223 00:08:43.271697 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 00:08:43 crc kubenswrapper[4735]: E0223 00:08:43.271956 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.272002 4735 scope.go:117] "RemoveContainer" containerID="2c0ef57c9ddaf9bb3702b7cffe2357050374ab9cd1307b52e2dd91097b45c24d" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.374628 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.374691 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.374712 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.374740 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.374762 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:43Z","lastTransitionTime":"2026-02-23T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.478466 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.478535 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.478560 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.478586 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.478610 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:43Z","lastTransitionTime":"2026-02-23T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.572719 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 10:50:11.16426719 +0000 UTC Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.581383 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.581434 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.581451 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.581476 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.581492 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:43Z","lastTransitionTime":"2026-02-23T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.623655 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-59rkm_66853c8a-9391-4291-b5f1-c72cb5fe23e8/ovnkube-controller/2.log" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.628063 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" event={"ID":"66853c8a-9391-4291-b5f1-c72cb5fe23e8","Type":"ContainerStarted","Data":"dca9bac923d652f6ba53fb96a23f576aaf01f489e0626f5714ab42d5b3f46d33"} Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.628697 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.644342 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb9c43a6-6806-484e-aaf0-8dd620528fad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8717b726bbca97df0f82ec42ac1d3c84d9fa009f79ed5dbd4b4a0bc2ae54c896\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ab7a6b4768fded1fc53121addf4b591e68d18801561691f449f38565c6e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ab7a6b4768fded1fc53121addf4b591e68d18801561691f449f38565c6e2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:43Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.677928 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"20a3f0f8-dc0e-4c2c-95ac-2ef9874425f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/e
tc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b23b046baba5
0661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:43Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.686119 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.686171 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.686195 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.686219 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.686237 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:43Z","lastTransitionTime":"2026-02-23T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.697316 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44e04dd6-c3d9-417b-aa26-2f605e07fc7b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0741d00d65832abe165aed347ca8633cb72d8d3395922dfa5615527f37d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6722b48f73153052807e8d89e0ef3bbada7ac84740e56c10f8767509b55ab5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73257ceeafdfe417dd887f970b9f16ddd188d8ed38699ec89189e451698539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd35aa0556235dde90cf7c6f1750c98352ccdfc64f6d4b9acd35efed020aeb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd35aa0556235dde90cf7c6f1750c98352ccdfc64f6d4b9acd35efed020aeb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:43Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.723949 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jfk7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c2131b2-ccbc-49ff-a0bc-fd6639563dd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0d7eacf1e5180d67f73b2e794d7a0a995619d05d1019a3e152e3fa4b7c4c369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5g8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jfk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:43Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.746950 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:43Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.765319 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a35acbfb5836e5ddac27f7e140e64897f9f2ad3615e4db66b091b79744c699d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c4f2f9d3265d84be4a241b5a2a0193896ad4e9daeec1aaca6fe4fef0be21c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:43Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.781426 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zhj4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dd002d-a62b-432e-bc82-21ffdfc5e0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3765b3d0261e124aa7d128e9bdf43a6c76939b5d63f3960ab38a129bf3476c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8649q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zhj4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:43Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.792087 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.792202 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.792252 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.792276 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.792294 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:43Z","lastTransitionTime":"2026-02-23T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.806432 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zmdgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c844ff7-f40a-43c5-bf50-dae480a77428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae14a0327db69c9cd0b4cb514bee921487ce35aaeb18535d8d44bc172fcdf59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52p6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e72aa9d3da2d55c67384b2de4aaab9402eff6a464c662ca173f18affd95f4a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52p6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:08:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zmdgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:43Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.816025 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bdqfd" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b542cb9e-35cc-44d9-a850-c41887636c4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8qrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8qrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:08:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bdqfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:43Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:43 crc 
kubenswrapper[4735]: I0223 00:08:43.840332 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1153ca78-e3e2-458f-8a31-e16738c1677c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43f8fa8d2d76248580ea4800c8e0e97c060feb42b7c78fa9be3f5adbeaaf9437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a47ab0d8562bafd3080c29289e6cd496a9f76188960a3b6036cc044609876406\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd23f1d94c143d946e301f9af58c6fe189e1e95ef05c2648ad167acab65726df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb8e52070bd8c7a9784d6f81037eee627297839cf49ec88b861a6d8f3a7a16e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:43Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.870699 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://340ce9e1f14e0dc6339a25b1bd23f58d9790656cfea764bac921d60603920cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:43Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.885867 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04edd9324344faf24b9181705097f238050d69e14bf46da2f76f3d114552c942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:43Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.894190 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.894239 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.894252 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.894266 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.894275 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:43Z","lastTransitionTime":"2026-02-23T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.898814 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cba474f-2d55-4a07-969f-25e2817a06d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3ae04d2c5dac3e689b744efd8990e7210f627cf79b77aab8cdb477e22eedeef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://908fd79ca8e314a794a24b502c23d0b5abbc41b2bf079086e8fe6fc3ceaeafa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-blmnv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:43Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.914007 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26428" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29209462-90c5-4aa1-9943-d8b15ac1b5a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23e62d470b47146565ead752e6fb5578b3951831fd10126ec5ba4752e6a42e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13db4
202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13db4202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26428\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:43Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.925474 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40cc14f8-4b0d-4268-82bc-e9c2d8073cf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T00:07:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0223 00:07:35.797171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 00:07:35.799032 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411013438/tls.crt::/tmp/serving-cert-1411013438/tls.key\\\\\\\"\\\\nI0223 00:07:41.683988 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0223 00:07:41.688343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0223 00:07:41.688375 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0223 00:07:41.688410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0223 00:07:41.688422 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0223 00:07:41.698776 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0223 00:07:41.698820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698840 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0223 00:07:41.698875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0223 00:07:41.698881 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0223 00:07:41.698924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0223 00:07:41.699038 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0223 00:07:41.701822 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:43Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.937627 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:43Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.952310 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:43Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.966517 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4gvxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b63c18f-b6b2-4d97-b542-7800b475bd4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cd4fa9902eb9e1566216eb36f85b29b30e6a4a3f687b029c77356de5620c5e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33055c975ea730c1618798b3885ea706d784d23c63c80c8fbb66ffbc27318c38\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T00:08:36Z\\\",\\\"message\\\":\\\"2026-02-23T00:07:51+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f4be492d-9475-498b-a2ce-b93560b9971a\\\\n2026-02-23T00:07:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f4be492d-9475-498b-a2ce-b93560b9971a to /host/opt/cni/bin/\\\\n2026-02-23T00:07:51Z [verbose] multus-daemon started\\\\n2026-02-23T00:07:51Z [verbose] 
Readiness Indicator file check\\\\n2026-02-23T00:08:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4pzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4gvxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:43Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.989642 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66853c8a-9391-4291-b5f1-c72cb5fe23e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dca9bac923d652f6ba53fb96a23f576aaf01f489e0626f5714ab42d5b3f46d33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c0ef57c9ddaf9bb3702b7cffe2357050374ab9cd1307b52e2dd91097b45c24d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T00:08:16Z\\\",\\\"message\\\":\\\"obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0223 00:08:16.276462 6384 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0223 00:08:16.276482 6384 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0223 00:08:16.276493 6384 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0223 00:08:16.276504 6384 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0223 00:08:16.276531 6384 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-26428\\\\nF0223 00:08:16.276536 6384 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc 
annotat\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:08:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-59rkm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:43Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.997430 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.997487 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.997503 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.997525 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:43 crc kubenswrapper[4735]: I0223 00:08:43.997541 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:43Z","lastTransitionTime":"2026-02-23T00:08:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.101349 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.101409 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.101421 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.101456 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.101471 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:44Z","lastTransitionTime":"2026-02-23T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.203617 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.204054 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.204066 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.204080 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.204088 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:44Z","lastTransitionTime":"2026-02-23T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.272174 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:08:44 crc kubenswrapper[4735]: E0223 00:08:44.272354 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.272187 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:08:44 crc kubenswrapper[4735]: E0223 00:08:44.272713 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.310649 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.310685 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.310697 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.310713 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.310727 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:44Z","lastTransitionTime":"2026-02-23T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.414135 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.414200 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.414217 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.414248 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.414270 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:44Z","lastTransitionTime":"2026-02-23T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.517285 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.517342 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.517359 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.517381 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.517399 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:44Z","lastTransitionTime":"2026-02-23T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.573043 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 01:10:39.825446224 +0000 UTC Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.620330 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.620384 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.620401 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.620423 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.620445 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:44Z","lastTransitionTime":"2026-02-23T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.634829 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-59rkm_66853c8a-9391-4291-b5f1-c72cb5fe23e8/ovnkube-controller/3.log" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.635841 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-59rkm_66853c8a-9391-4291-b5f1-c72cb5fe23e8/ovnkube-controller/2.log" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.640341 4735 generic.go:334] "Generic (PLEG): container finished" podID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerID="dca9bac923d652f6ba53fb96a23f576aaf01f489e0626f5714ab42d5b3f46d33" exitCode=1 Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.640394 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" event={"ID":"66853c8a-9391-4291-b5f1-c72cb5fe23e8","Type":"ContainerDied","Data":"dca9bac923d652f6ba53fb96a23f576aaf01f489e0626f5714ab42d5b3f46d33"} Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.640436 4735 scope.go:117] "RemoveContainer" containerID="2c0ef57c9ddaf9bb3702b7cffe2357050374ab9cd1307b52e2dd91097b45c24d" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.643628 4735 scope.go:117] "RemoveContainer" containerID="dca9bac923d652f6ba53fb96a23f576aaf01f489e0626f5714ab42d5b3f46d33" Feb 23 00:08:44 crc kubenswrapper[4735]: E0223 00:08:44.643943 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-59rkm_openshift-ovn-kubernetes(66853c8a-9391-4291-b5f1-c72cb5fe23e8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.665909 4735 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40cc14f8-4b0d-4268-82bc-e9c2d8073cf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765\\\",\\\"image\\\":\\\"quay.io/crcont/o
penshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca
68126a5ac981133c1c0f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T00:07:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0223 00:07:35.797171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 00:07:35.799032 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411013438/tls.crt::/tmp/serving-cert-1411013438/tls.key\\\\\\\"\\\\nI0223 00:07:41.683988 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0223 00:07:41.688343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0223 00:07:41.688375 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0223 00:07:41.688410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0223 00:07:41.688422 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0223 00:07:41.698776 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0223 00:07:41.698820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698831 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698840 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0223 00:07:41.698875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0223 00:07:41.698881 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0223 00:07:41.698924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' 
detected.\\\\nI0223 00:07:41.699038 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0223 00:07:41.701822 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"s
tate\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:44Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.689994 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:44Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.712702 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:44Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.723754 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.723808 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.723831 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.723890 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.723910 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:44Z","lastTransitionTime":"2026-02-23T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.743808 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4gvxr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b63c18f-b6b2-4d97-b542-7800b475bd4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cd4fa9902eb9e1566216eb36f85b29b30e6a4a3f687b029c77356de5620c5e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33055c975ea730c1618798b3885ea706d784d23c63c80c8fbb66ffbc27318c38\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T00:08:36Z\\\",\\\"message\\\":\\\"2026-02-23T00:07:51+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ 
to /host/opt/cni/bin/upgrade_f4be492d-9475-498b-a2ce-b93560b9971a\\\\n2026-02-23T00:07:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f4be492d-9475-498b-a2ce-b93560b9971a to /host/opt/cni/bin/\\\\n2026-02-23T00:07:51Z [verbose] multus-daemon started\\\\n2026-02-23T00:07:51Z [verbose] Readiness Indicator file check\\\\n2026-02-23T00:08:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/mult
us.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4pzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4gvxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:44Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.768713 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66853c8a-9391-4291-b5f1-c72cb5fe23e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dca9bac923d652f6ba53fb96a23f576aaf01f489e0626f5714ab42d5b3f46d33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c0ef57c9ddaf9bb3702b7cffe2357050374ab9cd1307b52e2dd91097b45c24d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T00:08:16Z\\\",\\\"message\\\":\\\"obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0223 00:08:16.276462 6384 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0223 00:08:16.276482 6384 ovn.go:134] Ensuring zone local for Pod 
openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0223 00:08:16.276493 6384 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0223 00:08:16.276504 6384 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0223 00:08:16.276531 6384 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-26428\\\\nF0223 00:08:16.276536 6384 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotat\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:08:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dca9bac923d652f6ba53fb96a23f576aaf01f489e0626f5714ab42d5b3f46d33\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T00:08:44Z\\\",\\\"message\\\":\\\"ce.beta.openshift.io/serving-cert-secret-name:networking-console-plugin-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{operator.openshift.io/v1 Network cluster 8d01ddba-7e05-4639-926a-4485de3b6327 0xc00772e57e 0xc00772e57f}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:9443,TargetPort:{1 0 https},NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app.kubernetes.io/component: 
networking-console-plugin,app.kubernetes.io/managed-by: cluster-network-operator,app.kubernetes.io/name: networking-console-plugin,app.kubernetes.io/part-of: cluster-network-operator,},ClusterIP:10.217.4.246,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.246],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nF0223 00:08:44.385829 6775 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:08:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"na
me\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/service
account\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-59rkm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:44Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.785827 4735 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb9c43a6-6806-484e-aaf0-8dd620528fad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8717b726bbca97df0f82ec42ac1d3c84d9fa009f79ed5dbd4b4a0bc2ae54c896\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ab7a6b4768fded1fc53121addf4b591e68d18801561691f449f38565c6e2c\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ab7a6b4768fded1fc53121addf4b591e68d18801561691f449f38565c6e2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:44Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.819812 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20a3f0f8-dc0e-4c2c-95ac-2ef9874425f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:44Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.827104 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.827143 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.827153 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.827170 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.827181 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:44Z","lastTransitionTime":"2026-02-23T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.838189 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44e04dd6-c3d9-417b-aa26-2f605e07fc7b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0741d00d65832abe165aed347ca8633cb72d8d3395922dfa5615527f37d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6722b48f73153052807e8d89e0ef3b
bada7ac84740e56c10f8767509b55ab5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73257ceeafdfe417dd887f970b9f16ddd188d8ed38699ec89189e451698539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd35aa0556235dde90cf7c6f1750c98352ccdfc64f6d4b9acd35efed020aeb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd35aa0556235dde90cf7c6f1750c98352ccdfc64f6d4b9acd35efed020aeb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:44Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.852290 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jfk7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c2131b2-ccbc-49ff-a0bc-fd6639563dd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0d7eacf1e5180d67f73b2e794d7a0a995619d05d1019a3e152e3fa4b7c4c369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5g8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jfk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:44Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.867198 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:44Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.886391 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a35acbfb5836e5ddac27f7e140e64897f9f2ad3615e4db66b091b79744c699d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c4f2f9d3265d84be4a241b5a2a0193896ad4e9daeec1aaca6fe4fef0be21c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:44Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.900980 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-zhj4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dd002d-a62b-432e-bc82-21ffdfc5e0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3765b3d0261e124aa7d128e9bdf43a6c76939b5d63f3960ab38a129bf3476c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8649q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zhj4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:44Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.917690 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zmdgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c844ff7-f40a-43c5-bf50-dae480a77428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae14a0327db69c9cd0b4cb514bee921487ce35aaeb18535d8d44bc172fcdf59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52p6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e72aa9d3da2d55c67384b2de4aaab9402eff6a464c662ca173f18affd95f4a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52p6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:08:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zmdgs\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:44Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.930590 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bdqfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b542cb9e-35cc-44d9-a850-c41887636c4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8qrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8qrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:08:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bdqfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:44Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:44 crc 
kubenswrapper[4735]: I0223 00:08:44.933006 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.933071 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.933094 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.933125 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.933148 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:44Z","lastTransitionTime":"2026-02-23T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.951396 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1153ca78-e3e2-458f-8a31-e16738c1677c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43f8fa8d2d76248580ea4800c8e0e97c060feb42b7c78fa9be3f5adbeaaf9437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a47ab0d8562
bafd3080c29289e6cd496a9f76188960a3b6036cc044609876406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd23f1d94c143d946e301f9af58c6fe189e1e95ef05c2648ad167acab65726df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb8e52070bd8c7a9784d6f81037eee627297839cf49ec88b861a6d8f3a7a16e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:44Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.973273 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://340ce9e1f14e0dc6339a25b1bd23f58d9790656cfea764bac921d60603920cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:44Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:44 crc kubenswrapper[4735]: I0223 00:08:44.990406 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04edd9324344faf24b9181705097f238050d69e14bf46da2f76f3d114552c942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:44Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.006072 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cba474f-2d55-4a07-969f-25e2817a06d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3ae04d2c5dac3e689b744efd8990e7210f627cf79b77aab8cdb477e22eedeef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://908fd79ca8e314a794a24b502c23d0b5abbc41b2bf079086e8fe6fc3ceaeafa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-blmnv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-02-23T00:08:45Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.024935 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26428" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"29209462-90c5-4aa1-9943-d8b15ac1b5a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23e62d470b47146565ead752e6fb5578b3951831fd10126ec5ba4752e6a42e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\"
:false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-relea
se\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13db4202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13db4202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-binco
py\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"
Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26428\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:45Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.036573 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.036611 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.036623 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.036640 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.036652 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:45Z","lastTransitionTime":"2026-02-23T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.139283 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.139320 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.139330 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.139344 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.139356 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:45Z","lastTransitionTime":"2026-02-23T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.242843 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.242900 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.242912 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.242928 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.242941 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:45Z","lastTransitionTime":"2026-02-23T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.271756 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:08:45 crc kubenswrapper[4735]: E0223 00:08:45.271955 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.272190 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:08:45 crc kubenswrapper[4735]: E0223 00:08:45.272483 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.346455 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.346512 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.346528 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.346549 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.346567 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:45Z","lastTransitionTime":"2026-02-23T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.450075 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.450120 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.450137 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.450158 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.450177 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:45Z","lastTransitionTime":"2026-02-23T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.533113 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.533283 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:08:45 crc kubenswrapper[4735]: E0223 00:08:45.533372 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:09:49.533305442 +0000 UTC m=+147.996851453 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:08:45 crc kubenswrapper[4735]: E0223 00:08:45.533456 4735 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 00:08:45 crc kubenswrapper[4735]: E0223 00:08:45.533557 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 00:09:49.533536058 +0000 UTC m=+147.997082059 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.533598 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:08:45 crc kubenswrapper[4735]: E0223 00:08:45.533750 4735 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 00:08:45 crc kubenswrapper[4735]: E0223 00:08:45.533797 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-23 00:09:49.533783534 +0000 UTC m=+147.997329535 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.553114 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.553157 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.553172 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.553195 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.553212 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:45Z","lastTransitionTime":"2026-02-23T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.573249 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 13:27:54.61080039 +0000 UTC Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.634101 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.634627 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:08:45 crc kubenswrapper[4735]: E0223 00:08:45.634543 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 00:08:45 crc kubenswrapper[4735]: E0223 00:08:45.635123 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 00:08:45 crc kubenswrapper[4735]: E0223 00:08:45.635308 4735 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Feb 23 00:08:45 crc kubenswrapper[4735]: E0223 00:08:45.635510 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-23 00:09:49.635486531 +0000 UTC m=+148.099032532 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 00:08:45 crc kubenswrapper[4735]: E0223 00:08:45.634908 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 23 00:08:45 crc kubenswrapper[4735]: E0223 00:08:45.636045 4735 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 23 00:08:45 crc kubenswrapper[4735]: E0223 00:08:45.636161 4735 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 00:08:45 crc kubenswrapper[4735]: E0223 00:08:45.636334 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-02-23 00:09:49.636315861 +0000 UTC m=+148.099861862 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.648471 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-59rkm_66853c8a-9391-4291-b5f1-c72cb5fe23e8/ovnkube-controller/3.log" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.654959 4735 scope.go:117] "RemoveContainer" containerID="dca9bac923d652f6ba53fb96a23f576aaf01f489e0626f5714ab42d5b3f46d33" Feb 23 00:08:45 crc kubenswrapper[4735]: E0223 00:08:45.655270 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-59rkm_openshift-ovn-kubernetes(66853c8a-9391-4291-b5f1-c72cb5fe23e8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.656884 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.656952 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.656977 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.657008 4735 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.657033 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:45Z","lastTransitionTime":"2026-02-23T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.676602 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a35acbfb5836e5ddac27f7e140e64897f9f2ad3615e4db66b091b79744c699d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\
\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c4f2f9d3265d84be4a241b5a2a0193896ad4e9daeec1aaca6fe4fef0be21c88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:45Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.693744 4735 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-zhj4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1dd002d-a62b-432e-bc82-21ffdfc5e0b7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3765b3d0261e124aa7d128e9bdf43a6c76939b5d63f3960ab38a129bf3476c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8649q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:51Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-zhj4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:45Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.713393 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zmdgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c844ff7-f40a-43c5-bf50-dae480a77428\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae14a0327db69c9cd0b4cb514bee921487ce35aaeb18535d8d44bc172fcdf59d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3
d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52p6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e72aa9d3da2d55c67384b2de4aaab9402eff6a464c662ca173f18affd95f4a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:08:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-52p6d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:08:02Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zmdgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:45Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.730445 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-bdqfd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b542cb9e-35cc-44d9-a850-c41887636c4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:04Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8qrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8qrl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:08:04Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-bdqfd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:45Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:45 crc 
kubenswrapper[4735]: I0223 00:08:45.751117 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:45Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.760548 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.760613 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.760635 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.760667 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.760689 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:45Z","lastTransitionTime":"2026-02-23T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.772705 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:42Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://340ce9e1f14e0dc6339a25b1bd23f58d9790656cfea764bac921d60603920cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:45Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.792744 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04edd9324344faf24b9181705097f238050d69e14bf46da2f76f3d114552c942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:45Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.810975 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1cba474f-2d55-4a07-969f-25e2817a06d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3ae04d2c5dac3e689b744efd8990e7210f627cf79b77aab8cdb477e22eedeef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://908fd79ca8e314a794a24b502c23d0b5abbc41b2
bf079086e8fe6fc3ceaeafa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wp46c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-blmnv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:45Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.835502 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-26428" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"29209462-90c5-4aa1-9943-d8b15ac1b5a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23e62d470b47146565ead752e6fb5578b3951831fd10126ec5ba4752e6a42e66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79dce74fd559025b2189707a12ccebd8f3b4c7272404da7c2894afbd93e89ec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://daed10f742736584cbcc990cad2b90bc7369ef57e070a0833edd32c91caee0a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7c6bb41d75aae664831e015804e908fa90f67333072bc6b9da60558203b1dec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://13db4
202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://13db4202da55f79e3ee56a33e4f7097f05f5df0f04a0338162d31f2cbe06c214\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0a76083e8ecc82d4bc89ef3b9d15d6cf0e7cc5b595ccb93cace248394b1cc9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd13dbac6516ffc4a0d95b04640cb0fd1cfaaac9f4e044f94380209da4a540e8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x6sqn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-26428\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:45Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.858491 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1153ca78-e3e2-458f-8a31-e16738c1677c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43f8fa8d2d76248580ea4800c8e0e97c060feb42b7c78fa9be3f5adbeaaf9437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a47ab0d8562bafd3080c29289e6cd496a9f76188960a3b6036cc044609876406\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd23f1d94c143d946e301f9af58c6fe189e1e95ef05c2648ad167acab65726df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfb8e52070bd8c7a9784d6f81037eee627297839cf49ec88b861a6d8f3a7a16e\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:45Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.866780 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.866906 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.866939 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.866973 
4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.866999 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:45Z","lastTransitionTime":"2026-02-23T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.880031 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:45Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.901077 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:45Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.922688 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-4gvxr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b63c18f-b6b2-4d97-b542-7800b475bd4c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5cd4fa9902eb9e1566216eb36f85b29b30e6a4a3f687b029c77356de5620c5e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33055c975ea730c1618798b3885ea706d784d23c63c80c8fbb66ffbc27318c38\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T00:08:36Z\\\",\\\"message\\\":\\\"2026-02-23T00:07:51+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f4be492d-9475-498b-a2ce-b93560b9971a\\\\n2026-02-23T00:07:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f4be492d-9475-498b-a2ce-b93560b9971a to /host/opt/cni/bin/\\\\n2026-02-23T00:07:51Z [verbose] multus-daemon started\\\\n2026-02-23T00:07:51Z [verbose] 
Readiness Indicator file check\\\\n2026-02-23T00:08:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:08:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-n4pzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-multus\"/\"multus-4gvxr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:45Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.953660 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66853c8a-9391-4291-b5f1-c72cb5fe23e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dca9bac923d652f6ba53fb96a23f576aaf01f489e0626f5714ab42d5b3f46d33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dca9bac923d652f6ba53fb96a23f576aaf01f489e0626f5714ab42d5b3f46d33\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-23T00:08:44Z\\\",\\\"message\\\":\\\"ce.beta.openshift.io/serving-cert-secret-name:networking-console-plugin-cert 
service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{operator.openshift.io/v1 Network cluster 8d01ddba-7e05-4639-926a-4485de3b6327 0xc00772e57e 0xc00772e57f}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:https,Protocol:TCP,Port:9443,TargetPort:{1 0 https},NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app.kubernetes.io/component: networking-console-plugin,app.kubernetes.io/managed-by: cluster-network-operator,app.kubernetes.io/name: networking-console-plugin,app.kubernetes.io/part-of: cluster-network-operator,},ClusterIP:10.217.4.246,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.246],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nF0223 00:08:44.385829 6775 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:08:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-59rkm_openshift-ovn-kubernetes(66853c8a-9391-4291-b5f1-c72cb5fe23e8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db57f90dfdb2c4b17
e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tzvft\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:49Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-59rkm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:45Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.969805 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.969927 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.969953 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.969986 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.970010 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:45Z","lastTransitionTime":"2026-02-23T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:45 crc kubenswrapper[4735]: I0223 00:08:45.977727 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40cc14f8-4b0d-4268-82bc-e9c2d8073cf3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d
f7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-23T00:07:41Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0223 00:07:35.797171 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0223 00:07:35.799032 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1411013438/tls.crt::/tmp/serving-cert-1411013438/tls.key\\\\\\\"\\\\nI0223 00:07:41.683988 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0223 00:07:41.688343 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0223 00:07:41.688375 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0223 00:07:41.688410 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0223 00:07:41.688422 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0223 00:07:41.698776 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0223 00:07:41.698820 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698831 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0223 00:07:41.698840 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0223 00:07:41.698875 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0223 00:07:41.698881 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0223 00:07:41.698924 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0223 00:07:41.699038 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0223 00:07:41.701822 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:45Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.012493 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"20a3f0f8-dc0e-4c2c-95ac-2ef9874425f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1d2ce9c61dc8a1fbdfaec9af084753a45301d36efe79584d66b0755995e69962\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5389ef753364210ff8b309a1e7999559575f83dd6101d3651b13ab77fb5e7e88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa49aea5e599d416e66a3b7c92d2a85b665ce37579763173ebb227809d20e20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2cce546bb1dd122873d7458b320df7f94c97a44748ca82c301289e0d599c2bca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://38fa98cd762271a7cbb427044ee1e79f238303650293e977808a5e26c89b8bcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1992992c7d1605e860925b4374d2a0474bfde2039ae65385872daabc7afbae10\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3af0a6c09e6d6c669bf7b3faae57843585d3866b616cee673b5c6426ae47d97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b23b046baba50661f976aea230d17b117b64bb45daaca0ba4773710ce31dbaea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:46Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.031806 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44e04dd6-c3d9-417b-aa26-2f605e07fc7b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4dd0741d00d65832abe165aed347ca8633cb72d8d3395922dfa5615527f37d55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6722b48f73153052807e8d89e0ef3bbada7ac84740e56c10f8767509b55ab5c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b73257ceeafdfe417dd887f970b9f16ddd188d8ed38699ec89189e451698539\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fd35aa0556235dde90cf7c6f1750c98352ccdfc64f6d4b9acd35efed020aeb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fd35aa0556235dde90cf7c6f1750c98352ccdfc64f6d4b9acd35efed020aeb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:46Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.046379 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-jfk7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c2131b2-ccbc-49ff-a0bc-fd6639563dd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0d7eacf1e5180d67f73b2e794d7a0a995619d05d1019a3e152e3fa4b7c4c369\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r5g8g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:48Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-jfk7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:46Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.060905 4735 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fb9c43a6-6806-484e-aaf0-8dd620528fad\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-23T00:07:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8717b726bbca97df0f82ec42ac1d3c84d9fa009f79ed5dbd4b4a0bc2ae54c896\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac
-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-23T00:07:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://880ab7a6b4768fded1fc53121addf4b591e68d18801561691f449f38565c6e2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://880ab7a6b4768fded1fc53121addf4b591e68d18801561691f449f38565c6e2c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-23T00:07:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-23T00:07:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-23T00:07:22Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:46Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.073283 4735 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.073319 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.073334 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.073352 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.073367 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:46Z","lastTransitionTime":"2026-02-23T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.176578 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.176610 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.176621 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.176637 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.176648 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:46Z","lastTransitionTime":"2026-02-23T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.271924 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.271957 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:08:46 crc kubenswrapper[4735]: E0223 00:08:46.272097 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 00:08:46 crc kubenswrapper[4735]: E0223 00:08:46.272254 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.282139 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.282225 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.282253 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.282308 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.282333 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:46Z","lastTransitionTime":"2026-02-23T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.385761 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.385842 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.385895 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.385920 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.385937 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:46Z","lastTransitionTime":"2026-02-23T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.489377 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.489478 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.489509 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.489539 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.489562 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:46Z","lastTransitionTime":"2026-02-23T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.574006 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 03:08:31.351141893 +0000 UTC Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.592734 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.592792 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.592810 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.592838 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.592889 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:46Z","lastTransitionTime":"2026-02-23T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.695648 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.695785 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.695812 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.695840 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.695901 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:46Z","lastTransitionTime":"2026-02-23T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.800270 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.800331 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.800347 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.800372 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.800389 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:46Z","lastTransitionTime":"2026-02-23T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.904172 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.904569 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.904785 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.905055 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:46 crc kubenswrapper[4735]: I0223 00:08:46.905269 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:46Z","lastTransitionTime":"2026-02-23T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.008416 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.008460 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.008477 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.008499 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.008516 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:47Z","lastTransitionTime":"2026-02-23T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.112262 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.112323 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.112342 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.112367 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.112384 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:47Z","lastTransitionTime":"2026-02-23T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.214907 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.214976 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.215005 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.215068 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.215104 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:47Z","lastTransitionTime":"2026-02-23T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.272158 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.272231 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:08:47 crc kubenswrapper[4735]: E0223 00:08:47.272351 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c" Feb 23 00:08:47 crc kubenswrapper[4735]: E0223 00:08:47.272829 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.317519 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.317582 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.317599 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.317621 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.317637 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:47Z","lastTransitionTime":"2026-02-23T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.420032 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.420089 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.420105 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.420131 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.420148 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:47Z","lastTransitionTime":"2026-02-23T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.523087 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.523164 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.523187 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.523214 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.523235 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:47Z","lastTransitionTime":"2026-02-23T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.574824 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 10:19:35.786793651 +0000 UTC Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.626006 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.626055 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.626072 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.626094 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.626110 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:47Z","lastTransitionTime":"2026-02-23T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.728929 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.729019 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.729035 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.729058 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.729076 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:47Z","lastTransitionTime":"2026-02-23T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.832135 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.832193 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.832217 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.832247 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.832268 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:47Z","lastTransitionTime":"2026-02-23T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.934966 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.935032 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.935053 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.935079 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:47 crc kubenswrapper[4735]: I0223 00:08:47.935097 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:47Z","lastTransitionTime":"2026-02-23T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.037807 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.037903 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.037922 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.037945 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.037962 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:48Z","lastTransitionTime":"2026-02-23T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.141290 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.141367 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.141389 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.141415 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.141436 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:48Z","lastTransitionTime":"2026-02-23T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.244148 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.244200 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.244222 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.244249 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.244271 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:48Z","lastTransitionTime":"2026-02-23T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.271916 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.271945 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:08:48 crc kubenswrapper[4735]: E0223 00:08:48.272441 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 00:08:48 crc kubenswrapper[4735]: E0223 00:08:48.272292 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.346777 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.347253 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.347272 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.347298 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.347317 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:48Z","lastTransitionTime":"2026-02-23T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.450215 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.450277 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.450293 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.450317 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.450333 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:48Z","lastTransitionTime":"2026-02-23T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.553399 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.553472 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.553488 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.553512 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.553528 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:48Z","lastTransitionTime":"2026-02-23T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.575718 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 11:07:56.802764299 +0000 UTC Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.655939 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.656000 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.656018 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.656043 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.656060 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:48Z","lastTransitionTime":"2026-02-23T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.759376 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.759452 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.759472 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.759496 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.759513 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:48Z","lastTransitionTime":"2026-02-23T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.863349 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.863669 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.863875 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.864039 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.864174 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:48Z","lastTransitionTime":"2026-02-23T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.968202 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.968268 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.968288 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.968313 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:48 crc kubenswrapper[4735]: I0223 00:08:48.968329 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:48Z","lastTransitionTime":"2026-02-23T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.071967 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.072520 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.072709 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.072926 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.073302 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:49Z","lastTransitionTime":"2026-02-23T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.175832 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.176277 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.176585 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.176803 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.177003 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:49Z","lastTransitionTime":"2026-02-23T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.272170 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.273123 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:08:49 crc kubenswrapper[4735]: E0223 00:08:49.273306 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c" Feb 23 00:08:49 crc kubenswrapper[4735]: E0223 00:08:49.273421 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.279565 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.279613 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.279631 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.279654 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.279672 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:49Z","lastTransitionTime":"2026-02-23T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.382125 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.382178 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.382195 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.382218 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.382235 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:49Z","lastTransitionTime":"2026-02-23T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.484934 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.484984 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.485002 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.485023 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.485039 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:49Z","lastTransitionTime":"2026-02-23T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.575889 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 01:39:30.588470439 +0000 UTC Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.587579 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.587635 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.587652 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.587676 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.587693 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:49Z","lastTransitionTime":"2026-02-23T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.663984 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.664052 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.664072 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.664099 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.664121 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:49Z","lastTransitionTime":"2026-02-23T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:49 crc kubenswrapper[4735]: E0223 00:08:49.677807 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc670e79-a4c0-4f94-a41e-9a217a93a98f\\\",\\\"systemUUID\\\":\\\"34aea2d1-4777-4ec2-a0dd-7ee942962cf5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:49Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.682341 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.682377 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.682394 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.682417 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.682434 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:49Z","lastTransitionTime":"2026-02-23T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:49 crc kubenswrapper[4735]: E0223 00:08:49.695540 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc670e79-a4c0-4f94-a41e-9a217a93a98f\\\",\\\"systemUUID\\\":\\\"34aea2d1-4777-4ec2-a0dd-7ee942962cf5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:49Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.700765 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.700878 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.700914 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.700949 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.700973 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:49Z","lastTransitionTime":"2026-02-23T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:49 crc kubenswrapper[4735]: E0223 00:08:49.713543 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc670e79-a4c0-4f94-a41e-9a217a93a98f\\\",\\\"systemUUID\\\":\\\"34aea2d1-4777-4ec2-a0dd-7ee942962cf5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:49Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.718426 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.718478 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.718498 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.718519 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.718538 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:49Z","lastTransitionTime":"2026-02-23T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:49 crc kubenswrapper[4735]: E0223 00:08:49.736115 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc670e79-a4c0-4f94-a41e-9a217a93a98f\\\",\\\"systemUUID\\\":\\\"34aea2d1-4777-4ec2-a0dd-7ee942962cf5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:49Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.740750 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.740809 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.740826 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.740886 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.740907 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:49Z","lastTransitionTime":"2026-02-23T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:49 crc kubenswrapper[4735]: E0223 00:08:49.760831 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:08:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-23T00:08:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"fc670e79-a4c0-4f94-a41e-9a217a93a98f\\\",\\\"systemUUID\\\":\\\"34aea2d1-4777-4ec2-a0dd-7ee942962cf5\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-23T00:08:49Z is after 2025-08-24T17:21:41Z" Feb 23 00:08:49 crc kubenswrapper[4735]: E0223 00:08:49.761086 4735 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.763334 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.763395 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.763420 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.763452 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.763475 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:49Z","lastTransitionTime":"2026-02-23T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.866387 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.866444 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.866470 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.866497 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.866520 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:49Z","lastTransitionTime":"2026-02-23T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.969756 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.969796 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.969813 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.969834 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:49 crc kubenswrapper[4735]: I0223 00:08:49.969894 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:49Z","lastTransitionTime":"2026-02-23T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:50 crc kubenswrapper[4735]: I0223 00:08:50.072200 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:50 crc kubenswrapper[4735]: I0223 00:08:50.072248 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:50 crc kubenswrapper[4735]: I0223 00:08:50.072268 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:50 crc kubenswrapper[4735]: I0223 00:08:50.072288 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:50 crc kubenswrapper[4735]: I0223 00:08:50.072304 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:50Z","lastTransitionTime":"2026-02-23T00:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:50 crc kubenswrapper[4735]: I0223 00:08:50.174779 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:50 crc kubenswrapper[4735]: I0223 00:08:50.174846 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:50 crc kubenswrapper[4735]: I0223 00:08:50.174896 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:50 crc kubenswrapper[4735]: I0223 00:08:50.174922 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:50 crc kubenswrapper[4735]: I0223 00:08:50.174941 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:50Z","lastTransitionTime":"2026-02-23T00:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:50 crc kubenswrapper[4735]: I0223 00:08:50.271186 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:08:50 crc kubenswrapper[4735]: I0223 00:08:50.271291 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:08:50 crc kubenswrapper[4735]: E0223 00:08:50.271341 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 00:08:50 crc kubenswrapper[4735]: E0223 00:08:50.271491 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 00:08:50 crc kubenswrapper[4735]: I0223 00:08:50.278130 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:50 crc kubenswrapper[4735]: I0223 00:08:50.278170 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:50 crc kubenswrapper[4735]: I0223 00:08:50.278188 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:50 crc kubenswrapper[4735]: I0223 00:08:50.278209 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:50 crc kubenswrapper[4735]: I0223 00:08:50.278225 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:50Z","lastTransitionTime":"2026-02-23T00:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:50 crc kubenswrapper[4735]: I0223 00:08:50.381938 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:50 crc kubenswrapper[4735]: I0223 00:08:50.382274 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:50 crc kubenswrapper[4735]: I0223 00:08:50.382406 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:50 crc kubenswrapper[4735]: I0223 00:08:50.382536 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:50 crc kubenswrapper[4735]: I0223 00:08:50.382660 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:50Z","lastTransitionTime":"2026-02-23T00:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:50 crc kubenswrapper[4735]: I0223 00:08:50.486036 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:50 crc kubenswrapper[4735]: I0223 00:08:50.486105 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:50 crc kubenswrapper[4735]: I0223 00:08:50.486123 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:50 crc kubenswrapper[4735]: I0223 00:08:50.486149 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:50 crc kubenswrapper[4735]: I0223 00:08:50.486167 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:50Z","lastTransitionTime":"2026-02-23T00:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:50 crc kubenswrapper[4735]: I0223 00:08:50.576354 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 11:28:13.397464229 +0000 UTC Feb 23 00:08:50 crc kubenswrapper[4735]: I0223 00:08:50.589322 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:50 crc kubenswrapper[4735]: I0223 00:08:50.589563 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:50 crc kubenswrapper[4735]: I0223 00:08:50.589723 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:50 crc kubenswrapper[4735]: I0223 00:08:50.589919 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:50 crc kubenswrapper[4735]: I0223 00:08:50.590078 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:50Z","lastTransitionTime":"2026-02-23T00:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:50 crc kubenswrapper[4735]: I0223 00:08:50.692698 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:50 crc kubenswrapper[4735]: I0223 00:08:50.693127 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:50 crc kubenswrapper[4735]: I0223 00:08:50.693295 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:50 crc kubenswrapper[4735]: I0223 00:08:50.693440 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:50 crc kubenswrapper[4735]: I0223 00:08:50.693696 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:50Z","lastTransitionTime":"2026-02-23T00:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:50 crc kubenswrapper[4735]: I0223 00:08:50.796541 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:50 crc kubenswrapper[4735]: I0223 00:08:50.797082 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:50 crc kubenswrapper[4735]: I0223 00:08:50.797247 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:50 crc kubenswrapper[4735]: I0223 00:08:50.797413 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:50 crc kubenswrapper[4735]: I0223 00:08:50.797561 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:50Z","lastTransitionTime":"2026-02-23T00:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:50 crc kubenswrapper[4735]: I0223 00:08:50.901031 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:50 crc kubenswrapper[4735]: I0223 00:08:50.901095 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:50 crc kubenswrapper[4735]: I0223 00:08:50.901122 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:50 crc kubenswrapper[4735]: I0223 00:08:50.901153 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:50 crc kubenswrapper[4735]: I0223 00:08:50.901175 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:50Z","lastTransitionTime":"2026-02-23T00:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.003958 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.004017 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.004037 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.004062 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.004079 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:51Z","lastTransitionTime":"2026-02-23T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.107120 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.107183 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.107202 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.107228 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.107250 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:51Z","lastTransitionTime":"2026-02-23T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.210474 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.210531 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.210553 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.210584 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.210606 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:51Z","lastTransitionTime":"2026-02-23T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.272063 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.272085 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:08:51 crc kubenswrapper[4735]: E0223 00:08:51.272262 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 00:08:51 crc kubenswrapper[4735]: E0223 00:08:51.272568 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c" Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.313252 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.313313 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.313332 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.313355 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.313371 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:51Z","lastTransitionTime":"2026-02-23T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.416887 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.416932 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.416948 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.416974 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.416994 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:51Z","lastTransitionTime":"2026-02-23T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.520361 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.520404 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.520421 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.520444 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.520463 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:51Z","lastTransitionTime":"2026-02-23T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.577733 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 06:25:53.637576695 +0000 UTC Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.622935 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.623221 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.623362 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.623489 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.623627 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:51Z","lastTransitionTime":"2026-02-23T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.726938 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.727003 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.727020 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.727042 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.727059 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:51Z","lastTransitionTime":"2026-02-23T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.830420 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.830798 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.830999 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.831175 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.831345 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:51Z","lastTransitionTime":"2026-02-23T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.934902 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.934950 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.934967 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.934989 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:51 crc kubenswrapper[4735]: I0223 00:08:51.935006 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:51Z","lastTransitionTime":"2026-02-23T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.038161 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.038223 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.038248 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.038278 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.038302 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:52Z","lastTransitionTime":"2026-02-23T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.140633 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.140692 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.140713 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.140740 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.140762 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:52Z","lastTransitionTime":"2026-02-23T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.243948 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.244011 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.244036 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.244072 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.244095 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:52Z","lastTransitionTime":"2026-02-23T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.271760 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.271819 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:08:52 crc kubenswrapper[4735]: E0223 00:08:52.272160 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 00:08:52 crc kubenswrapper[4735]: E0223 00:08:52.272408 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.338294 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=70.338268431 podStartE2EDuration="1m10.338268431s" podCreationTimestamp="2026-02-23 00:07:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:08:52.316977329 +0000 UTC m=+90.780523360" watchObservedRunningTime="2026-02-23 00:08:52.338268431 +0000 UTC m=+90.801814442" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.346393 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.346437 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.346455 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.346477 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.346496 4735 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:52Z","lastTransitionTime":"2026-02-23T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.434329 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-4gvxr" podStartSLOduration=64.434293112 podStartE2EDuration="1m4.434293112s" podCreationTimestamp="2026-02-23 00:07:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:08:52.387485376 +0000 UTC m=+90.851031377" watchObservedRunningTime="2026-02-23 00:08:52.434293112 +0000 UTC m=+90.897839133" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.449809 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.449887 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.449927 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.449945 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.449959 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:52Z","lastTransitionTime":"2026-02-23T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.452242 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=21.452199593 podStartE2EDuration="21.452199593s" podCreationTimestamp="2026-02-23 00:08:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:08:52.45208808 +0000 UTC m=+90.915634081" watchObservedRunningTime="2026-02-23 00:08:52.452199593 +0000 UTC m=+90.915745604" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.489767 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=72.489741397 podStartE2EDuration="1m12.489741397s" podCreationTimestamp="2026-02-23 00:07:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:08:52.48942789 +0000 UTC m=+90.952973921" watchObservedRunningTime="2026-02-23 00:08:52.489741397 +0000 UTC m=+90.953287388" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.509171 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=39.509151734 podStartE2EDuration="39.509151734s" podCreationTimestamp="2026-02-23 00:08:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:08:52.508840647 +0000 UTC m=+90.972386658" watchObservedRunningTime="2026-02-23 00:08:52.509151734 +0000 UTC m=+90.972697715" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.534824 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-dns/node-resolver-jfk7q" podStartSLOduration=64.534803651 podStartE2EDuration="1m4.534803651s" podCreationTimestamp="2026-02-23 00:07:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:08:52.534038044 +0000 UTC m=+90.997584025" watchObservedRunningTime="2026-02-23 00:08:52.534803651 +0000 UTC m=+90.998349632" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.552550 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.552891 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.553018 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.553110 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.553194 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:52Z","lastTransitionTime":"2026-02-23T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.578198 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 11:22:57.896840342 +0000 UTC Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.602731 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-zhj4f" podStartSLOduration=64.602708906 podStartE2EDuration="1m4.602708906s" podCreationTimestamp="2026-02-23 00:07:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:08:52.602660805 +0000 UTC m=+91.066206796" watchObservedRunningTime="2026-02-23 00:08:52.602708906 +0000 UTC m=+91.066254877" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.633290 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zmdgs" podStartSLOduration=63.633266722 podStartE2EDuration="1m3.633266722s" podCreationTimestamp="2026-02-23 00:07:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:08:52.619797568 +0000 UTC m=+91.083343569" watchObservedRunningTime="2026-02-23 00:08:52.633266722 +0000 UTC m=+91.096812713" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.654659 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=68.654639866 podStartE2EDuration="1m8.654639866s" podCreationTimestamp="2026-02-23 00:07:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:08:52.65399871 +0000 UTC m=+91.117544681" watchObservedRunningTime="2026-02-23 
00:08:52.654639866 +0000 UTC m=+91.118185847" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.655801 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.655827 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.655839 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.655869 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.655882 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:52Z","lastTransitionTime":"2026-02-23T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.707047 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" podStartSLOduration=64.707025137 podStartE2EDuration="1m4.707025137s" podCreationTimestamp="2026-02-23 00:07:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:08:52.706809292 +0000 UTC m=+91.170355273" watchObservedRunningTime="2026-02-23 00:08:52.707025137 +0000 UTC m=+91.170571108" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.729197 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-26428" podStartSLOduration=64.72917057 podStartE2EDuration="1m4.72917057s" podCreationTimestamp="2026-02-23 00:07:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:08:52.72872851 +0000 UTC m=+91.192274491" watchObservedRunningTime="2026-02-23 00:08:52.72917057 +0000 UTC m=+91.192716581" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.757535 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.757575 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.757584 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.757600 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.757609 4735 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:52Z","lastTransitionTime":"2026-02-23T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.860700 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.860770 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.860792 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.860818 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.860837 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:52Z","lastTransitionTime":"2026-02-23T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.963490 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.963531 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.963550 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.963567 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:52 crc kubenswrapper[4735]: I0223 00:08:52.963577 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:52Z","lastTransitionTime":"2026-02-23T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.066562 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.066612 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.066629 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.066653 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.066670 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:53Z","lastTransitionTime":"2026-02-23T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.170168 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.170232 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.170251 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.170276 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.170294 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:53Z","lastTransitionTime":"2026-02-23T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.271242 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.271268 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:08:53 crc kubenswrapper[4735]: E0223 00:08:53.271429 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c" Feb 23 00:08:53 crc kubenswrapper[4735]: E0223 00:08:53.271700 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.273095 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.273141 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.273159 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.273182 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.273200 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:53Z","lastTransitionTime":"2026-02-23T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.377461 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.377526 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.377546 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.377570 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.377590 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:53Z","lastTransitionTime":"2026-02-23T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.481053 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.481142 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.481161 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.481649 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.481707 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:53Z","lastTransitionTime":"2026-02-23T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.578340 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 02:16:26.045848931 +0000 UTC Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.584670 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.584731 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.584749 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.584766 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.584779 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:53Z","lastTransitionTime":"2026-02-23T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.687833 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.688230 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.688376 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.688524 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.688667 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:53Z","lastTransitionTime":"2026-02-23T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.791802 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.791967 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.791988 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.792013 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.792035 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:53Z","lastTransitionTime":"2026-02-23T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.895438 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.895499 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.895516 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.895539 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.895556 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:53Z","lastTransitionTime":"2026-02-23T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.998778 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.998829 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.998901 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.998932 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:53 crc kubenswrapper[4735]: I0223 00:08:53.998953 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:53Z","lastTransitionTime":"2026-02-23T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:54 crc kubenswrapper[4735]: I0223 00:08:54.101346 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:54 crc kubenswrapper[4735]: I0223 00:08:54.101406 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:54 crc kubenswrapper[4735]: I0223 00:08:54.101430 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:54 crc kubenswrapper[4735]: I0223 00:08:54.101462 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:54 crc kubenswrapper[4735]: I0223 00:08:54.101482 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:54Z","lastTransitionTime":"2026-02-23T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:54 crc kubenswrapper[4735]: I0223 00:08:54.207194 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:54 crc kubenswrapper[4735]: I0223 00:08:54.207260 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:54 crc kubenswrapper[4735]: I0223 00:08:54.207284 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:54 crc kubenswrapper[4735]: I0223 00:08:54.207316 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:54 crc kubenswrapper[4735]: I0223 00:08:54.207341 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:54Z","lastTransitionTime":"2026-02-23T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:54 crc kubenswrapper[4735]: I0223 00:08:54.271733 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:08:54 crc kubenswrapper[4735]: E0223 00:08:54.271885 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 00:08:54 crc kubenswrapper[4735]: I0223 00:08:54.271979 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:08:54 crc kubenswrapper[4735]: E0223 00:08:54.272164 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 00:08:54 crc kubenswrapper[4735]: I0223 00:08:54.310261 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:54 crc kubenswrapper[4735]: I0223 00:08:54.310411 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:54 crc kubenswrapper[4735]: I0223 00:08:54.310434 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:54 crc kubenswrapper[4735]: I0223 00:08:54.310458 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:54 crc kubenswrapper[4735]: I0223 00:08:54.310475 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:54Z","lastTransitionTime":"2026-02-23T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:54 crc kubenswrapper[4735]: I0223 00:08:54.413209 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:54 crc kubenswrapper[4735]: I0223 00:08:54.413242 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:54 crc kubenswrapper[4735]: I0223 00:08:54.413251 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:54 crc kubenswrapper[4735]: I0223 00:08:54.413262 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:54 crc kubenswrapper[4735]: I0223 00:08:54.413271 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:54Z","lastTransitionTime":"2026-02-23T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:54 crc kubenswrapper[4735]: I0223 00:08:54.516293 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:54 crc kubenswrapper[4735]: I0223 00:08:54.516329 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:54 crc kubenswrapper[4735]: I0223 00:08:54.516340 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:54 crc kubenswrapper[4735]: I0223 00:08:54.516355 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:54 crc kubenswrapper[4735]: I0223 00:08:54.516367 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:54Z","lastTransitionTime":"2026-02-23T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:54 crc kubenswrapper[4735]: I0223 00:08:54.579236 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 08:50:20.123236702 +0000 UTC Feb 23 00:08:54 crc kubenswrapper[4735]: I0223 00:08:54.619687 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:54 crc kubenswrapper[4735]: I0223 00:08:54.619746 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:54 crc kubenswrapper[4735]: I0223 00:08:54.619767 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:54 crc kubenswrapper[4735]: I0223 00:08:54.619791 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:54 crc kubenswrapper[4735]: I0223 00:08:54.619810 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:54Z","lastTransitionTime":"2026-02-23T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:54 crc kubenswrapper[4735]: I0223 00:08:54.722584 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:54 crc kubenswrapper[4735]: I0223 00:08:54.722657 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:54 crc kubenswrapper[4735]: I0223 00:08:54.722685 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:54 crc kubenswrapper[4735]: I0223 00:08:54.722713 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:54 crc kubenswrapper[4735]: I0223 00:08:54.722733 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:54Z","lastTransitionTime":"2026-02-23T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:54 crc kubenswrapper[4735]: I0223 00:08:54.826357 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:54 crc kubenswrapper[4735]: I0223 00:08:54.826412 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:54 crc kubenswrapper[4735]: I0223 00:08:54.826429 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:54 crc kubenswrapper[4735]: I0223 00:08:54.826451 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:54 crc kubenswrapper[4735]: I0223 00:08:54.826468 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:54Z","lastTransitionTime":"2026-02-23T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:54 crc kubenswrapper[4735]: I0223 00:08:54.929281 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:54 crc kubenswrapper[4735]: I0223 00:08:54.929332 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:54 crc kubenswrapper[4735]: I0223 00:08:54.929348 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:54 crc kubenswrapper[4735]: I0223 00:08:54.929372 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:54 crc kubenswrapper[4735]: I0223 00:08:54.929391 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:54Z","lastTransitionTime":"2026-02-23T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.031941 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.032002 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.032019 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.032041 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.032059 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:55Z","lastTransitionTime":"2026-02-23T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.135774 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.135847 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.135903 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.135934 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.135956 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:55Z","lastTransitionTime":"2026-02-23T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.238349 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.238405 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.238419 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.238438 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.238453 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:55Z","lastTransitionTime":"2026-02-23T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.271226 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.271233 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:08:55 crc kubenswrapper[4735]: E0223 00:08:55.271453 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 00:08:55 crc kubenswrapper[4735]: E0223 00:08:55.271617 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c" Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.340985 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.341053 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.341070 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.341092 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.341110 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:55Z","lastTransitionTime":"2026-02-23T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.444581 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.444631 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.444645 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.444662 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.444673 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:55Z","lastTransitionTime":"2026-02-23T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.547681 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.547758 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.547786 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.547813 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.547830 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:55Z","lastTransitionTime":"2026-02-23T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.579780 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 02:14:05.610768633 +0000 UTC Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.651308 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.651378 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.651402 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.651436 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.651460 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:55Z","lastTransitionTime":"2026-02-23T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.753807 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.753934 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.753962 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.753993 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.754015 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:55Z","lastTransitionTime":"2026-02-23T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.856758 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.856904 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.856960 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.856995 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.857018 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:55Z","lastTransitionTime":"2026-02-23T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.960211 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.960264 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.960281 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.960305 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:55 crc kubenswrapper[4735]: I0223 00:08:55.960322 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:55Z","lastTransitionTime":"2026-02-23T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.063250 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.063302 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.063319 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.063343 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.063358 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:56Z","lastTransitionTime":"2026-02-23T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.165918 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.165966 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.165977 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.165989 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.165998 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:56Z","lastTransitionTime":"2026-02-23T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.268842 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.268941 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.268963 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.269052 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.269082 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:56Z","lastTransitionTime":"2026-02-23T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.271547 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.271563 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:08:56 crc kubenswrapper[4735]: E0223 00:08:56.271728 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 00:08:56 crc kubenswrapper[4735]: E0223 00:08:56.271840 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.371688 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.371756 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.371779 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.371806 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.371828 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:56Z","lastTransitionTime":"2026-02-23T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.475261 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.475322 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.475334 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.475352 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.475364 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:56Z","lastTransitionTime":"2026-02-23T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.576977 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.577057 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.577076 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.577098 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.577114 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:56Z","lastTransitionTime":"2026-02-23T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.580294 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 20:34:51.838740319 +0000 UTC Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.680067 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.680140 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.680164 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.680191 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.680215 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:56Z","lastTransitionTime":"2026-02-23T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.783420 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.783496 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.783520 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.783545 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.783563 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:56Z","lastTransitionTime":"2026-02-23T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.886169 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.886249 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.886273 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.886299 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.886326 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:56Z","lastTransitionTime":"2026-02-23T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.990166 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.990233 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.990259 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.990288 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:56 crc kubenswrapper[4735]: I0223 00:08:56.990306 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:56Z","lastTransitionTime":"2026-02-23T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:57 crc kubenswrapper[4735]: I0223 00:08:57.094111 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:57 crc kubenswrapper[4735]: I0223 00:08:57.094191 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:57 crc kubenswrapper[4735]: I0223 00:08:57.094214 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:57 crc kubenswrapper[4735]: I0223 00:08:57.094246 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:57 crc kubenswrapper[4735]: I0223 00:08:57.094271 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:57Z","lastTransitionTime":"2026-02-23T00:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:57 crc kubenswrapper[4735]: I0223 00:08:57.198205 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:57 crc kubenswrapper[4735]: I0223 00:08:57.198291 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:57 crc kubenswrapper[4735]: I0223 00:08:57.198315 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:57 crc kubenswrapper[4735]: I0223 00:08:57.198346 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:57 crc kubenswrapper[4735]: I0223 00:08:57.198376 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:57Z","lastTransitionTime":"2026-02-23T00:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:57 crc kubenswrapper[4735]: I0223 00:08:57.271405 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:08:57 crc kubenswrapper[4735]: I0223 00:08:57.271425 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:08:57 crc kubenswrapper[4735]: E0223 00:08:57.271604 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c" Feb 23 00:08:57 crc kubenswrapper[4735]: E0223 00:08:57.271687 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 00:08:57 crc kubenswrapper[4735]: I0223 00:08:57.300970 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:57 crc kubenswrapper[4735]: I0223 00:08:57.301045 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:57 crc kubenswrapper[4735]: I0223 00:08:57.301069 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:57 crc kubenswrapper[4735]: I0223 00:08:57.301099 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:57 crc kubenswrapper[4735]: I0223 00:08:57.301121 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:57Z","lastTransitionTime":"2026-02-23T00:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:57 crc kubenswrapper[4735]: I0223 00:08:57.404319 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:57 crc kubenswrapper[4735]: I0223 00:08:57.404395 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:57 crc kubenswrapper[4735]: I0223 00:08:57.404416 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:57 crc kubenswrapper[4735]: I0223 00:08:57.404443 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:57 crc kubenswrapper[4735]: I0223 00:08:57.404464 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:57Z","lastTransitionTime":"2026-02-23T00:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:57 crc kubenswrapper[4735]: I0223 00:08:57.507325 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:57 crc kubenswrapper[4735]: I0223 00:08:57.507399 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:57 crc kubenswrapper[4735]: I0223 00:08:57.507417 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:57 crc kubenswrapper[4735]: I0223 00:08:57.507440 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:57 crc kubenswrapper[4735]: I0223 00:08:57.507457 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:57Z","lastTransitionTime":"2026-02-23T00:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:57 crc kubenswrapper[4735]: I0223 00:08:57.581450 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 06:28:21.028280958 +0000 UTC Feb 23 00:08:57 crc kubenswrapper[4735]: I0223 00:08:57.610128 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:57 crc kubenswrapper[4735]: I0223 00:08:57.610186 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:57 crc kubenswrapper[4735]: I0223 00:08:57.610202 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:57 crc kubenswrapper[4735]: I0223 00:08:57.610225 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:57 crc kubenswrapper[4735]: I0223 00:08:57.610242 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:57Z","lastTransitionTime":"2026-02-23T00:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:57 crc kubenswrapper[4735]: I0223 00:08:57.712817 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:57 crc kubenswrapper[4735]: I0223 00:08:57.712926 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:57 crc kubenswrapper[4735]: I0223 00:08:57.712953 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:57 crc kubenswrapper[4735]: I0223 00:08:57.712987 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:57 crc kubenswrapper[4735]: I0223 00:08:57.713011 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:57Z","lastTransitionTime":"2026-02-23T00:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:57 crc kubenswrapper[4735]: I0223 00:08:57.815420 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:57 crc kubenswrapper[4735]: I0223 00:08:57.815487 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:57 crc kubenswrapper[4735]: I0223 00:08:57.815509 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:57 crc kubenswrapper[4735]: I0223 00:08:57.815535 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:57 crc kubenswrapper[4735]: I0223 00:08:57.815556 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:57Z","lastTransitionTime":"2026-02-23T00:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:57 crc kubenswrapper[4735]: I0223 00:08:57.918129 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:57 crc kubenswrapper[4735]: I0223 00:08:57.918191 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:57 crc kubenswrapper[4735]: I0223 00:08:57.918209 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:57 crc kubenswrapper[4735]: I0223 00:08:57.918233 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:57 crc kubenswrapper[4735]: I0223 00:08:57.918252 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:57Z","lastTransitionTime":"2026-02-23T00:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.022228 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.022294 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.022312 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.022332 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.022352 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:58Z","lastTransitionTime":"2026-02-23T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.125101 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.125154 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.125169 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.125193 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.125211 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:58Z","lastTransitionTime":"2026-02-23T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.227844 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.228000 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.228019 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.228041 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.228058 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:58Z","lastTransitionTime":"2026-02-23T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.271584 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:08:58 crc kubenswrapper[4735]: E0223 00:08:58.271807 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.271595 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:08:58 crc kubenswrapper[4735]: E0223 00:08:58.272036 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.330490 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.330570 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.330593 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.330629 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.330653 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:58Z","lastTransitionTime":"2026-02-23T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.433428 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.433514 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.433539 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.433578 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.433603 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:58Z","lastTransitionTime":"2026-02-23T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.536370 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.536462 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.536487 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.536518 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.536540 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:58Z","lastTransitionTime":"2026-02-23T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.581564 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 12:12:18.29721364 +0000 UTC Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.639823 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.639960 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.639988 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.640017 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.640041 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:58Z","lastTransitionTime":"2026-02-23T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.742293 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.742399 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.742421 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.742450 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.742472 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:58Z","lastTransitionTime":"2026-02-23T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.846189 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.846240 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.846258 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.846282 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.846298 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:58Z","lastTransitionTime":"2026-02-23T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.949337 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.949394 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.949412 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.949436 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:58 crc kubenswrapper[4735]: I0223 00:08:58.949454 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:58Z","lastTransitionTime":"2026-02-23T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.052828 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.052921 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.052939 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.052963 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.052982 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:59Z","lastTransitionTime":"2026-02-23T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.156199 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.156260 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.156279 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.156303 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.156321 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:59Z","lastTransitionTime":"2026-02-23T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.260122 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.260190 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.260207 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.260235 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.260257 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:59Z","lastTransitionTime":"2026-02-23T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.271503 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.271602 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:08:59 crc kubenswrapper[4735]: E0223 00:08:59.271972 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 00:08:59 crc kubenswrapper[4735]: E0223 00:08:59.271813 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c" Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.363161 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.363214 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.363230 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.363251 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.363266 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:59Z","lastTransitionTime":"2026-02-23T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.466674 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.467181 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.467441 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.467679 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.467840 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:59Z","lastTransitionTime":"2026-02-23T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.571733 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.572574 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.572931 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.573098 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.573250 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:59Z","lastTransitionTime":"2026-02-23T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.582266 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 19:43:11.961248332 +0000 UTC Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.675931 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.676010 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.676034 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.676063 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.676119 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:59Z","lastTransitionTime":"2026-02-23T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.778832 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.779392 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.779554 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.779705 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.779827 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:59Z","lastTransitionTime":"2026-02-23T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.876410 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.876512 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.876533 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.876557 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.876743 4735 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-23T00:08:59Z","lastTransitionTime":"2026-02-23T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.936384 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-7lzpb"] Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.936909 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7lzpb" Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.940319 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.940610 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.940723 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 23 00:08:59 crc kubenswrapper[4735]: I0223 00:08:59.941076 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 23 00:09:00 crc kubenswrapper[4735]: I0223 00:09:00.100484 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/349beb1d-a40b-4758-b361-68b767fc24d8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-7lzpb\" (UID: \"349beb1d-a40b-4758-b361-68b767fc24d8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7lzpb" Feb 23 00:09:00 crc kubenswrapper[4735]: I0223 00:09:00.101027 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/349beb1d-a40b-4758-b361-68b767fc24d8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-7lzpb\" (UID: \"349beb1d-a40b-4758-b361-68b767fc24d8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7lzpb" Feb 23 00:09:00 crc kubenswrapper[4735]: I0223 00:09:00.101245 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/349beb1d-a40b-4758-b361-68b767fc24d8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-7lzpb\" (UID: \"349beb1d-a40b-4758-b361-68b767fc24d8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7lzpb" Feb 23 00:09:00 crc kubenswrapper[4735]: I0223 00:09:00.101422 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/349beb1d-a40b-4758-b361-68b767fc24d8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-7lzpb\" (UID: \"349beb1d-a40b-4758-b361-68b767fc24d8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7lzpb" Feb 23 00:09:00 crc kubenswrapper[4735]: I0223 00:09:00.101618 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/349beb1d-a40b-4758-b361-68b767fc24d8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-7lzpb\" (UID: \"349beb1d-a40b-4758-b361-68b767fc24d8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7lzpb" Feb 23 00:09:00 crc kubenswrapper[4735]: I0223 00:09:00.202982 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/349beb1d-a40b-4758-b361-68b767fc24d8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-7lzpb\" (UID: \"349beb1d-a40b-4758-b361-68b767fc24d8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7lzpb" Feb 23 00:09:00 crc kubenswrapper[4735]: I0223 00:09:00.203346 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/349beb1d-a40b-4758-b361-68b767fc24d8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-7lzpb\" (UID: \"349beb1d-a40b-4758-b361-68b767fc24d8\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7lzpb" Feb 23 00:09:00 crc kubenswrapper[4735]: I0223 00:09:00.203512 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/349beb1d-a40b-4758-b361-68b767fc24d8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-7lzpb\" (UID: \"349beb1d-a40b-4758-b361-68b767fc24d8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7lzpb" Feb 23 00:09:00 crc kubenswrapper[4735]: I0223 00:09:00.203726 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/349beb1d-a40b-4758-b361-68b767fc24d8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-7lzpb\" (UID: \"349beb1d-a40b-4758-b361-68b767fc24d8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7lzpb" Feb 23 00:09:00 crc kubenswrapper[4735]: I0223 00:09:00.203941 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/349beb1d-a40b-4758-b361-68b767fc24d8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-7lzpb\" (UID: \"349beb1d-a40b-4758-b361-68b767fc24d8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7lzpb" Feb 23 00:09:00 crc kubenswrapper[4735]: I0223 00:09:00.203616 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/349beb1d-a40b-4758-b361-68b767fc24d8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-7lzpb\" (UID: \"349beb1d-a40b-4758-b361-68b767fc24d8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7lzpb" Feb 23 00:09:00 crc kubenswrapper[4735]: I0223 00:09:00.203497 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/349beb1d-a40b-4758-b361-68b767fc24d8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-7lzpb\" (UID: \"349beb1d-a40b-4758-b361-68b767fc24d8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7lzpb" Feb 23 00:09:00 crc kubenswrapper[4735]: I0223 00:09:00.204570 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/349beb1d-a40b-4758-b361-68b767fc24d8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-7lzpb\" (UID: \"349beb1d-a40b-4758-b361-68b767fc24d8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7lzpb" Feb 23 00:09:00 crc kubenswrapper[4735]: I0223 00:09:00.213649 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/349beb1d-a40b-4758-b361-68b767fc24d8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-7lzpb\" (UID: \"349beb1d-a40b-4758-b361-68b767fc24d8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7lzpb" Feb 23 00:09:00 crc kubenswrapper[4735]: I0223 00:09:00.234100 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/349beb1d-a40b-4758-b361-68b767fc24d8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-7lzpb\" (UID: \"349beb1d-a40b-4758-b361-68b767fc24d8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7lzpb" Feb 23 00:09:00 crc kubenswrapper[4735]: I0223 00:09:00.257439 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7lzpb" Feb 23 00:09:00 crc kubenswrapper[4735]: I0223 00:09:00.272476 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 00:09:00 crc kubenswrapper[4735]: E0223 00:09:00.272651 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 23 00:09:00 crc kubenswrapper[4735]: I0223 00:09:00.273035 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 23 00:09:00 crc kubenswrapper[4735]: E0223 00:09:00.273798 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 23 00:09:00 crc kubenswrapper[4735]: I0223 00:09:00.274374 4735 scope.go:117] "RemoveContainer" containerID="dca9bac923d652f6ba53fb96a23f576aaf01f489e0626f5714ab42d5b3f46d33"
Feb 23 00:09:00 crc kubenswrapper[4735]: E0223 00:09:00.274884 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-59rkm_openshift-ovn-kubernetes(66853c8a-9391-4291-b5f1-c72cb5fe23e8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8"
Feb 23 00:09:00 crc kubenswrapper[4735]: I0223 00:09:00.583481 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 05:40:35.380181794 +0000 UTC
Feb 23 00:09:00 crc kubenswrapper[4735]: I0223 00:09:00.583592 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates
Feb 23 00:09:00 crc kubenswrapper[4735]: I0223 00:09:00.595409 4735 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Feb 23 00:09:00 crc kubenswrapper[4735]: I0223 00:09:00.705761 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7lzpb" event={"ID":"349beb1d-a40b-4758-b361-68b767fc24d8","Type":"ContainerStarted","Data":"f0ddabcab08726686961f9264f1c5eade8454f0f1bd6600719f2136fe3e6bac5"}
Feb 23 00:09:00 crc kubenswrapper[4735]: I0223 00:09:00.705822 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7lzpb" event={"ID":"349beb1d-a40b-4758-b361-68b767fc24d8","Type":"ContainerStarted","Data":"3c90f578c7ca306b04d5334fd352346f084c1c6566d5b5f07df613ea906ff261"}
Feb 23 00:09:00 crc kubenswrapper[4735]: I0223 00:09:00.728553 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-7lzpb" podStartSLOduration=72.728522999 podStartE2EDuration="1m12.728522999s" podCreationTimestamp="2026-02-23 00:07:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:09:00.726633894 +0000 UTC m=+99.190179895" watchObservedRunningTime="2026-02-23 00:09:00.728522999 +0000 UTC m=+99.192069010"
Feb 23 00:09:01 crc kubenswrapper[4735]: I0223 00:09:01.271799 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd"
Feb 23 00:09:01 crc kubenswrapper[4735]: I0223 00:09:01.271828 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 23 00:09:01 crc kubenswrapper[4735]: E0223 00:09:01.272112 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c"
Feb 23 00:09:01 crc kubenswrapper[4735]: E0223 00:09:01.272377 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 23 00:09:02 crc kubenswrapper[4735]: I0223 00:09:02.271217 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 00:09:02 crc kubenswrapper[4735]: I0223 00:09:02.271298 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 23 00:09:02 crc kubenswrapper[4735]: E0223 00:09:02.273175 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 23 00:09:02 crc kubenswrapper[4735]: E0223 00:09:02.273389 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 23 00:09:03 crc kubenswrapper[4735]: I0223 00:09:03.271438 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd"
Feb 23 00:09:03 crc kubenswrapper[4735]: E0223 00:09:03.271624 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c"
Feb 23 00:09:03 crc kubenswrapper[4735]: I0223 00:09:03.271983 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 23 00:09:03 crc kubenswrapper[4735]: E0223 00:09:03.272106 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 23 00:09:04 crc kubenswrapper[4735]: I0223 00:09:04.271508 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 23 00:09:04 crc kubenswrapper[4735]: I0223 00:09:04.271648 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 00:09:04 crc kubenswrapper[4735]: E0223 00:09:04.271824 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 23 00:09:04 crc kubenswrapper[4735]: E0223 00:09:04.272161 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 23 00:09:05 crc kubenswrapper[4735]: I0223 00:09:05.271350 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd"
Feb 23 00:09:05 crc kubenswrapper[4735]: I0223 00:09:05.271510 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 23 00:09:05 crc kubenswrapper[4735]: E0223 00:09:05.272011 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c"
Feb 23 00:09:05 crc kubenswrapper[4735]: E0223 00:09:05.272280 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 23 00:09:06 crc kubenswrapper[4735]: I0223 00:09:06.271954 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 23 00:09:06 crc kubenswrapper[4735]: I0223 00:09:06.272098 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 00:09:06 crc kubenswrapper[4735]: E0223 00:09:06.272266 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 23 00:09:06 crc kubenswrapper[4735]: E0223 00:09:06.272435 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 23 00:09:07 crc kubenswrapper[4735]: I0223 00:09:07.271826 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 23 00:09:07 crc kubenswrapper[4735]: I0223 00:09:07.271832 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd"
Feb 23 00:09:07 crc kubenswrapper[4735]: E0223 00:09:07.272068 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 23 00:09:07 crc kubenswrapper[4735]: E0223 00:09:07.272196 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c"
Feb 23 00:09:07 crc kubenswrapper[4735]: I0223 00:09:07.995025 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b542cb9e-35cc-44d9-a850-c41887636c4c-metrics-certs\") pod \"network-metrics-daemon-bdqfd\" (UID: \"b542cb9e-35cc-44d9-a850-c41887636c4c\") " pod="openshift-multus/network-metrics-daemon-bdqfd"
Feb 23 00:09:07 crc kubenswrapper[4735]: E0223 00:09:07.995303 4735 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 23 00:09:07 crc kubenswrapper[4735]: E0223 00:09:07.995383 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b542cb9e-35cc-44d9-a850-c41887636c4c-metrics-certs podName:b542cb9e-35cc-44d9-a850-c41887636c4c nodeName:}" failed. No retries permitted until 2026-02-23 00:10:11.995359216 +0000 UTC m=+170.458905227 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b542cb9e-35cc-44d9-a850-c41887636c4c-metrics-certs") pod "network-metrics-daemon-bdqfd" (UID: "b542cb9e-35cc-44d9-a850-c41887636c4c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 23 00:09:08 crc kubenswrapper[4735]: I0223 00:09:08.272412 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 23 00:09:08 crc kubenswrapper[4735]: I0223 00:09:08.273379 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 00:09:08 crc kubenswrapper[4735]: E0223 00:09:08.273727 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 23 00:09:08 crc kubenswrapper[4735]: E0223 00:09:08.273876 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 23 00:09:09 crc kubenswrapper[4735]: I0223 00:09:09.272051 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd"
Feb 23 00:09:09 crc kubenswrapper[4735]: E0223 00:09:09.272255 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c"
Feb 23 00:09:09 crc kubenswrapper[4735]: I0223 00:09:09.273258 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 23 00:09:09 crc kubenswrapper[4735]: E0223 00:09:09.273561 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 23 00:09:10 crc kubenswrapper[4735]: I0223 00:09:10.271544 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 23 00:09:10 crc kubenswrapper[4735]: E0223 00:09:10.271709 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 23 00:09:10 crc kubenswrapper[4735]: I0223 00:09:10.272010 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 00:09:10 crc kubenswrapper[4735]: E0223 00:09:10.272116 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 23 00:09:11 crc kubenswrapper[4735]: I0223 00:09:11.271951 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd"
Feb 23 00:09:11 crc kubenswrapper[4735]: I0223 00:09:11.271992 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 23 00:09:11 crc kubenswrapper[4735]: E0223 00:09:11.272141 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c"
Feb 23 00:09:11 crc kubenswrapper[4735]: E0223 00:09:11.272241 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 23 00:09:12 crc kubenswrapper[4735]: I0223 00:09:12.271158 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 23 00:09:12 crc kubenswrapper[4735]: I0223 00:09:12.271231 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 00:09:12 crc kubenswrapper[4735]: E0223 00:09:12.273234 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 23 00:09:12 crc kubenswrapper[4735]: E0223 00:09:12.273345 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 23 00:09:13 crc kubenswrapper[4735]: I0223 00:09:13.271763 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd"
Feb 23 00:09:13 crc kubenswrapper[4735]: I0223 00:09:13.271903 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 23 00:09:13 crc kubenswrapper[4735]: E0223 00:09:13.272347 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c"
Feb 23 00:09:13 crc kubenswrapper[4735]: E0223 00:09:13.272904 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 23 00:09:14 crc kubenswrapper[4735]: I0223 00:09:14.272164 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 00:09:14 crc kubenswrapper[4735]: I0223 00:09:14.272247 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 23 00:09:14 crc kubenswrapper[4735]: E0223 00:09:14.273116 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 23 00:09:14 crc kubenswrapper[4735]: E0223 00:09:14.273347 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 23 00:09:15 crc kubenswrapper[4735]: I0223 00:09:15.271137 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 23 00:09:15 crc kubenswrapper[4735]: I0223 00:09:15.271234 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd"
Feb 23 00:09:15 crc kubenswrapper[4735]: E0223 00:09:15.271332 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 23 00:09:15 crc kubenswrapper[4735]: E0223 00:09:15.271450 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c"
Feb 23 00:09:15 crc kubenswrapper[4735]: I0223 00:09:15.272616 4735 scope.go:117] "RemoveContainer" containerID="dca9bac923d652f6ba53fb96a23f576aaf01f489e0626f5714ab42d5b3f46d33"
Feb 23 00:09:15 crc kubenswrapper[4735]: E0223 00:09:15.272950 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-59rkm_openshift-ovn-kubernetes(66853c8a-9391-4291-b5f1-c72cb5fe23e8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8"
Feb 23 00:09:16 crc kubenswrapper[4735]: I0223 00:09:16.271379 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 23 00:09:16 crc kubenswrapper[4735]: I0223 00:09:16.271458 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 00:09:16 crc kubenswrapper[4735]: E0223 00:09:16.271582 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 23 00:09:16 crc kubenswrapper[4735]: E0223 00:09:16.272281 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 23 00:09:17 crc kubenswrapper[4735]: I0223 00:09:17.271824 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd"
Feb 23 00:09:17 crc kubenswrapper[4735]: I0223 00:09:17.271911 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 23 00:09:17 crc kubenswrapper[4735]: E0223 00:09:17.272195 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 23 00:09:17 crc kubenswrapper[4735]: E0223 00:09:17.272373 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c"
Feb 23 00:09:18 crc kubenswrapper[4735]: I0223 00:09:18.272138 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 23 00:09:18 crc kubenswrapper[4735]: I0223 00:09:18.272283 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 00:09:18 crc kubenswrapper[4735]: E0223 00:09:18.272348 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 23 00:09:18 crc kubenswrapper[4735]: E0223 00:09:18.272496 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 23 00:09:19 crc kubenswrapper[4735]: I0223 00:09:19.272066 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd"
Feb 23 00:09:19 crc kubenswrapper[4735]: I0223 00:09:19.272066 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 23 00:09:19 crc kubenswrapper[4735]: E0223 00:09:19.272355 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c"
Feb 23 00:09:19 crc kubenswrapper[4735]: E0223 00:09:19.272540 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 23 00:09:20 crc kubenswrapper[4735]: I0223 00:09:20.271238 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 00:09:20 crc kubenswrapper[4735]: I0223 00:09:20.271285 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 23 00:09:20 crc kubenswrapper[4735]: E0223 00:09:20.271464 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 23 00:09:20 crc kubenswrapper[4735]: E0223 00:09:20.271702 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 23 00:09:21 crc kubenswrapper[4735]: I0223 00:09:21.271367 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 23 00:09:21 crc kubenswrapper[4735]: I0223 00:09:21.271384 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd"
Feb 23 00:09:21 crc kubenswrapper[4735]: E0223 00:09:21.271525 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 23 00:09:21 crc kubenswrapper[4735]: E0223 00:09:21.271652 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c"
Feb 23 00:09:22 crc kubenswrapper[4735]: E0223 00:09:22.213778 4735 kubelet_node_status.go:497] "Node not becoming ready in time after startup"
Feb 23 00:09:22 crc kubenswrapper[4735]: I0223 00:09:22.271114 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 00:09:22 crc kubenswrapper[4735]: I0223 00:09:22.271176 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 23 00:09:22 crc kubenswrapper[4735]: E0223 00:09:22.273194 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 23 00:09:22 crc kubenswrapper[4735]: E0223 00:09:22.273344 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 23 00:09:22 crc kubenswrapper[4735]: E0223 00:09:22.386800 4735 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 23 00:09:23 crc kubenswrapper[4735]: I0223 00:09:23.272196 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 23 00:09:23 crc kubenswrapper[4735]: I0223 00:09:23.272275 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd"
Feb 23 00:09:23 crc kubenswrapper[4735]: E0223 00:09:23.272373 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 23 00:09:23 crc kubenswrapper[4735]: E0223 00:09:23.272588 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c"
Feb 23 00:09:23 crc kubenswrapper[4735]: I0223 00:09:23.789651 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4gvxr_5b63c18f-b6b2-4d97-b542-7800b475bd4c/kube-multus/1.log"
Feb 23 00:09:23 crc kubenswrapper[4735]: I0223 00:09:23.790430 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4gvxr_5b63c18f-b6b2-4d97-b542-7800b475bd4c/kube-multus/0.log"
Feb 23 00:09:23 crc kubenswrapper[4735]: I0223 00:09:23.790495 4735 generic.go:334] "Generic (PLEG): container finished" podID="5b63c18f-b6b2-4d97-b542-7800b475bd4c" containerID="5cd4fa9902eb9e1566216eb36f85b29b30e6a4a3f687b029c77356de5620c5e0" exitCode=1
Feb 23 00:09:23 crc kubenswrapper[4735]: I0223 00:09:23.790534 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4gvxr" event={"ID":"5b63c18f-b6b2-4d97-b542-7800b475bd4c","Type":"ContainerDied","Data":"5cd4fa9902eb9e1566216eb36f85b29b30e6a4a3f687b029c77356de5620c5e0"}
Feb 23 00:09:23 crc kubenswrapper[4735]: I0223 00:09:23.790583 4735 scope.go:117] "RemoveContainer" containerID="33055c975ea730c1618798b3885ea706d784d23c63c80c8fbb66ffbc27318c38"
Feb 23 00:09:23 crc kubenswrapper[4735]: I0223 00:09:23.791095 4735 scope.go:117] "RemoveContainer" containerID="5cd4fa9902eb9e1566216eb36f85b29b30e6a4a3f687b029c77356de5620c5e0"
Feb 23 00:09:23 crc kubenswrapper[4735]: E0223 00:09:23.791336 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-4gvxr_openshift-multus(5b63c18f-b6b2-4d97-b542-7800b475bd4c)\"" pod="openshift-multus/multus-4gvxr" podUID="5b63c18f-b6b2-4d97-b542-7800b475bd4c"
Feb 23 00:09:24 crc kubenswrapper[4735]: I0223 00:09:24.271214 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 00:09:24 crc kubenswrapper[4735]: I0223 00:09:24.271214 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 23 00:09:24 crc kubenswrapper[4735]: E0223 00:09:24.271791 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 00:09:24 crc kubenswrapper[4735]: E0223 00:09:24.271608 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 00:09:24 crc kubenswrapper[4735]: I0223 00:09:24.797447 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4gvxr_5b63c18f-b6b2-4d97-b542-7800b475bd4c/kube-multus/1.log" Feb 23 00:09:25 crc kubenswrapper[4735]: I0223 00:09:25.271946 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:09:25 crc kubenswrapper[4735]: E0223 00:09:25.272152 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c" Feb 23 00:09:25 crc kubenswrapper[4735]: I0223 00:09:25.271941 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:09:25 crc kubenswrapper[4735]: E0223 00:09:25.272421 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 00:09:26 crc kubenswrapper[4735]: I0223 00:09:26.271533 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:09:26 crc kubenswrapper[4735]: I0223 00:09:26.271549 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:09:26 crc kubenswrapper[4735]: E0223 00:09:26.271826 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 00:09:26 crc kubenswrapper[4735]: E0223 00:09:26.271971 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 00:09:27 crc kubenswrapper[4735]: I0223 00:09:27.272067 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:09:27 crc kubenswrapper[4735]: E0223 00:09:27.272285 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c" Feb 23 00:09:27 crc kubenswrapper[4735]: I0223 00:09:27.272774 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:09:27 crc kubenswrapper[4735]: E0223 00:09:27.272947 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 00:09:27 crc kubenswrapper[4735]: E0223 00:09:27.402791 4735 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 23 00:09:28 crc kubenswrapper[4735]: I0223 00:09:28.271943 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:09:28 crc kubenswrapper[4735]: I0223 00:09:28.272188 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:09:28 crc kubenswrapper[4735]: E0223 00:09:28.272271 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 00:09:28 crc kubenswrapper[4735]: E0223 00:09:28.272555 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 00:09:29 crc kubenswrapper[4735]: I0223 00:09:29.272017 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:09:29 crc kubenswrapper[4735]: E0223 00:09:29.272214 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c" Feb 23 00:09:29 crc kubenswrapper[4735]: I0223 00:09:29.272267 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:09:29 crc kubenswrapper[4735]: E0223 00:09:29.272549 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 00:09:30 crc kubenswrapper[4735]: I0223 00:09:30.271946 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:09:30 crc kubenswrapper[4735]: E0223 00:09:30.272137 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 00:09:30 crc kubenswrapper[4735]: I0223 00:09:30.272385 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:09:30 crc kubenswrapper[4735]: E0223 00:09:30.272819 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 00:09:30 crc kubenswrapper[4735]: I0223 00:09:30.273756 4735 scope.go:117] "RemoveContainer" containerID="dca9bac923d652f6ba53fb96a23f576aaf01f489e0626f5714ab42d5b3f46d33" Feb 23 00:09:30 crc kubenswrapper[4735]: I0223 00:09:30.824610 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-59rkm_66853c8a-9391-4291-b5f1-c72cb5fe23e8/ovnkube-controller/3.log" Feb 23 00:09:30 crc kubenswrapper[4735]: I0223 00:09:30.827437 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" event={"ID":"66853c8a-9391-4291-b5f1-c72cb5fe23e8","Type":"ContainerStarted","Data":"d86577bcf7609c1352ab606dcac0b9e98198dfdfa5e9d9d98e8aaad6ce7f2e1e"} Feb 23 00:09:30 crc kubenswrapper[4735]: I0223 00:09:30.829094 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:09:30 crc kubenswrapper[4735]: I0223 00:09:30.875715 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" podStartSLOduration=102.875701417 podStartE2EDuration="1m42.875701417s" podCreationTimestamp="2026-02-23 00:07:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:09:30.8745615 +0000 UTC m=+129.338107481" watchObservedRunningTime="2026-02-23 00:09:30.875701417 +0000 UTC m=+129.339247388" Feb 23 00:09:31 crc kubenswrapper[4735]: I0223 00:09:31.271947 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:09:31 crc kubenswrapper[4735]: I0223 00:09:31.272014 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:09:31 crc kubenswrapper[4735]: E0223 00:09:31.272214 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 00:09:31 crc kubenswrapper[4735]: E0223 00:09:31.272421 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c" Feb 23 00:09:31 crc kubenswrapper[4735]: I0223 00:09:31.307025 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-bdqfd"] Feb 23 00:09:31 crc kubenswrapper[4735]: I0223 00:09:31.831027 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:09:31 crc kubenswrapper[4735]: E0223 00:09:31.831163 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c" Feb 23 00:09:32 crc kubenswrapper[4735]: I0223 00:09:32.272090 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:09:32 crc kubenswrapper[4735]: I0223 00:09:32.272159 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:09:32 crc kubenswrapper[4735]: E0223 00:09:32.273330 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 00:09:32 crc kubenswrapper[4735]: E0223 00:09:32.273484 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 00:09:32 crc kubenswrapper[4735]: E0223 00:09:32.403496 4735 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 23 00:09:33 crc kubenswrapper[4735]: I0223 00:09:33.272146 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:09:33 crc kubenswrapper[4735]: I0223 00:09:33.272186 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:09:33 crc kubenswrapper[4735]: E0223 00:09:33.272360 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 00:09:33 crc kubenswrapper[4735]: E0223 00:09:33.272497 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c" Feb 23 00:09:34 crc kubenswrapper[4735]: I0223 00:09:34.272017 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:09:34 crc kubenswrapper[4735]: E0223 00:09:34.272208 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 00:09:34 crc kubenswrapper[4735]: I0223 00:09:34.272280 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:09:34 crc kubenswrapper[4735]: E0223 00:09:34.272762 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 00:09:34 crc kubenswrapper[4735]: I0223 00:09:34.272838 4735 scope.go:117] "RemoveContainer" containerID="5cd4fa9902eb9e1566216eb36f85b29b30e6a4a3f687b029c77356de5620c5e0" Feb 23 00:09:34 crc kubenswrapper[4735]: I0223 00:09:34.853415 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4gvxr_5b63c18f-b6b2-4d97-b542-7800b475bd4c/kube-multus/1.log" Feb 23 00:09:34 crc kubenswrapper[4735]: I0223 00:09:34.853633 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4gvxr" event={"ID":"5b63c18f-b6b2-4d97-b542-7800b475bd4c","Type":"ContainerStarted","Data":"89011481e42515946009b35bf0cf23e12f73377615e05b69a8471c4968e6bc01"} Feb 23 00:09:35 crc kubenswrapper[4735]: I0223 00:09:35.271360 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:09:35 crc kubenswrapper[4735]: I0223 00:09:35.271426 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:09:35 crc kubenswrapper[4735]: E0223 00:09:35.271924 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c" Feb 23 00:09:35 crc kubenswrapper[4735]: E0223 00:09:35.272052 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 00:09:36 crc kubenswrapper[4735]: I0223 00:09:36.271990 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:09:36 crc kubenswrapper[4735]: I0223 00:09:36.272070 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:09:36 crc kubenswrapper[4735]: E0223 00:09:36.272161 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 23 00:09:36 crc kubenswrapper[4735]: E0223 00:09:36.272248 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 23 00:09:37 crc kubenswrapper[4735]: I0223 00:09:37.271221 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:09:37 crc kubenswrapper[4735]: I0223 00:09:37.271284 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:09:37 crc kubenswrapper[4735]: E0223 00:09:37.271433 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bdqfd" podUID="b542cb9e-35cc-44d9-a850-c41887636c4c" Feb 23 00:09:37 crc kubenswrapper[4735]: E0223 00:09:37.271566 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 23 00:09:38 crc kubenswrapper[4735]: I0223 00:09:38.272143 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:09:38 crc kubenswrapper[4735]: I0223 00:09:38.272283 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 23 00:09:38 crc kubenswrapper[4735]: I0223 00:09:38.276716 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 23 00:09:38 crc kubenswrapper[4735]: I0223 00:09:38.277307 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 23 00:09:38 crc kubenswrapper[4735]: I0223 00:09:38.277317 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 23 00:09:38 crc kubenswrapper[4735]: I0223 00:09:38.278429 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 23 00:09:39 crc kubenswrapper[4735]: I0223 00:09:39.271965 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 23 00:09:39 crc kubenswrapper[4735]: I0223 00:09:39.272000 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd" Feb 23 00:09:39 crc kubenswrapper[4735]: I0223 00:09:39.274351 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 23 00:09:39 crc kubenswrapper[4735]: I0223 00:09:39.274573 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.435014 4735 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.488979 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jqwls"] Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.489581 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jqwls" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.494513 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-4mwf4"] Feb 23 00:09:40 crc kubenswrapper[4735]: W0223 00:09:40.495328 4735 reflector.go:561] object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv": failed to list *v1.Secret: secrets "openshift-apiserver-operator-dockercfg-xtcjv" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver-operator": no relationship found between node 'crc' and this object Feb 23 00:09:40 crc kubenswrapper[4735]: E0223 00:09:40.495432 4735 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver-operator\"/\"openshift-apiserver-operator-dockercfg-xtcjv\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openshift-apiserver-operator-dockercfg-xtcjv\" is forbidden: User \"system:node:crc\" cannot list 
resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.495373 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-4mwf4"
Feb 23 00:09:40 crc kubenswrapper[4735]: W0223 00:09:40.500232 4735 reflector.go:561] object-"openshift-apiserver-operator"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver-operator": no relationship found between node 'crc' and this object
Feb 23 00:09:40 crc kubenswrapper[4735]: E0223 00:09:40.500294 4735 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver-operator\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 23 00:09:40 crc kubenswrapper[4735]: W0223 00:09:40.500311 4735 reflector.go:561] object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert": failed to list *v1.Secret: secrets "openshift-apiserver-operator-serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver-operator": no relationship found between node 'crc' and this object
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.500337 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Feb 23 00:09:40 crc kubenswrapper[4735]: E0223 00:09:40.500394 4735 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver-operator\"/\"openshift-apiserver-operator-serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openshift-apiserver-operator-serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 23 00:09:40 crc kubenswrapper[4735]: W0223 00:09:40.500764 4735 reflector.go:561] object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config": failed to list *v1.ConfigMap: configmaps "openshift-apiserver-operator-config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver-operator": no relationship found between node 'crc' and this object
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.501886 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Feb 23 00:09:40 crc kubenswrapper[4735]: E0223 00:09:40.501968 4735 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver-operator\"/\"openshift-apiserver-operator-config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-apiserver-operator-config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.503078 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tjkjf"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.503912 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-tjkjf"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.506190 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6xqc"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.507203 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6xqc"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.507780 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-htb9k"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.508599 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-htb9k"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.515037 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.515523 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.518716 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-nhtzb"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.519382 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-nhtzb"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.519487 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-87xcv"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.519831 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.520079 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.520239 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-87xcv"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.520888 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.521029 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.521207 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.521316 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.521340 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.521541 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.521607 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.521549 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.521670 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.521261 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.520891 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.521897 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.521611 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.522023 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.521474 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.522146 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.521486 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-g2t27"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.521221 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.522409 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.522684 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jvnk7"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.522841 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g2t27"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.523181 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.524098 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-dsqjf"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.524452 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-dsqjf"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.527045 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-2mld6"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.527434 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.527633 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-g5xff"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.528064 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xgbf2"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.528300 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-2mld6"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.528545 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-xgbf2"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.529160 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-g5xff"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.537942 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lnz4f"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.538476 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lnz4f"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.554182 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.555048 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.555346 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.555668 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.556259 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.556281 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.556495 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.557563 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.557638 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.557576 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.558008 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.557611 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.555691 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.558199 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.554360 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.558580 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.559262 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.568622 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.569234 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.569920 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.570212 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.572051 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.572056 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.572269 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.572689 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/defd5be8-af50-4f87-a9a5-c166a9e3ce44-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-jqwls\" (UID: \"defd5be8-af50-4f87-a9a5-c166a9e3ce44\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jqwls"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.572732 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbv74\" (UniqueName: \"kubernetes.io/projected/defd5be8-af50-4f87-a9a5-c166a9e3ce44-kube-api-access-fbv74\") pod \"openshift-apiserver-operator-796bbdcf4f-jqwls\" (UID: \"defd5be8-af50-4f87-a9a5-c166a9e3ce44\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jqwls"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.572759 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khfnz\" (UniqueName: \"kubernetes.io/projected/f0643721-5a54-4c37-b857-474ed61ef531-kube-api-access-khfnz\") pod \"machine-api-operator-5694c8668f-4mwf4\" (UID: \"f0643721-5a54-4c37-b857-474ed61ef531\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4mwf4"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.572801 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f0643721-5a54-4c37-b857-474ed61ef531-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-4mwf4\" (UID: \"f0643721-5a54-4c37-b857-474ed61ef531\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4mwf4"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.572820 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0643721-5a54-4c37-b857-474ed61ef531-config\") pod \"machine-api-operator-5694c8668f-4mwf4\" (UID: \"f0643721-5a54-4c37-b857-474ed61ef531\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4mwf4"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.572816 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qzhgh"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.572841 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/defd5be8-af50-4f87-a9a5-c166a9e3ce44-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jqwls\" (UID: \"defd5be8-af50-4f87-a9a5-c166a9e3ce44\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jqwls"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.572891 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f0643721-5a54-4c37-b857-474ed61ef531-images\") pod \"machine-api-operator-5694c8668f-4mwf4\" (UID: \"f0643721-5a54-4c37-b857-474ed61ef531\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4mwf4"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.573357 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-pruner-29530080-8vgj6"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.573383 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.573438 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qzhgh"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.573555 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.573893 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29530080-8vgj6"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.574481 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.574507 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.574695 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.574849 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.575005 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.575201 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.575247 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.575310 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.575404 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.575438 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.575506 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.575612 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.575703 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.575870 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.576033 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.576156 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.576263 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.576399 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.576440 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.576507 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.576586 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.576654 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.576606 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.576762 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.576892 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.577070 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.577134 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.577080 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wtlds"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.577639 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wtlds"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.582426 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fpbc2"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.582655 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.582901 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.583042 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-fpbc2"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.583827 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.584092 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.585682 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xmz5p"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.585957 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.586175 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xmz5p"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.586701 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.587547 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-qnx85"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.588073 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qnx85"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.589588 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.592344 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.594105 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.597021 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.597508 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.599925 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.600042 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-d2dh9"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.600696 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-d2dh9"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.600797 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.601134 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.601818 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.601834 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-9xsmt"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.617297 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6ggwm"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.617714 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.625996 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9xsmt"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.640269 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.640542 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.640840 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-chd9d"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.641191 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6ggwm"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.641560 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-chd9d"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.641814 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.642005 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.642216 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.642251 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6lsk2"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.643079 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.643373 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-lftlx"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.644018 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-lftlx"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.645080 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wgw78"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.645667 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l5546"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.645706 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.646103 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wgw78"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.646153 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l5546"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.647412 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"pruner-dockercfg-p7bcw"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.648937 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x959d"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.649705 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x959d"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.650303 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-h9hzt"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.650928 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-kkmzn"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.651044 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-h9hzt"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.651306 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-kkmzn"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.651534 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-z7rvn"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.652086 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z7rvn"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.652090 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"serviceca"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.652902 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-f77lm"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.653400 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-f77lm"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.655135 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-z22gj"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.655688 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z22gj"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.656101 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-kkhmq"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.656784 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kkhmq"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.656978 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530080-85k47"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.657391 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530080-85k47"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.658103 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvfx4"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.659002 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvfx4"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.659422 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k8wg7"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.659830 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k8wg7"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.660579 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-knx8j"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.661147 4735 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-knx8j" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.661497 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jqwls"] Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.662705 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tjkjf"] Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.663847 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-4mwf4"] Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.664803 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qzhgh"] Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.666001 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-g5xff"] Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.667483 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-2mld6"] Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.668551 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xgbf2"] Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.669371 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-8g4g7"] Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.669934 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-8g4g7" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.672564 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.672874 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29530080-8vgj6"] Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.672898 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wtlds"] Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.673372 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f0643721-5a54-4c37-b857-474ed61ef531-images\") pod \"machine-api-operator-5694c8668f-4mwf4\" (UID: \"f0643721-5a54-4c37-b857-474ed61ef531\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4mwf4" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.673501 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25de1778-6e9d-4158-9aa4-02e1607d7f45-config\") pod \"route-controller-manager-6576b87f9c-c6xqc\" (UID: \"25de1778-6e9d-4158-9aa4-02e1607d7f45\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6xqc" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.673588 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e0c2b77e-cacd-4257-b3c1-699bd3f4261f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-lnz4f\" (UID: \"e0c2b77e-cacd-4257-b3c1-699bd3f4261f\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lnz4f" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.679591 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fpbc2"] Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.679626 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-nhtzb"] Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.679636 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6xqc"] Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.680273 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f0643721-5a54-4c37-b857-474ed61ef531-images\") pod \"machine-api-operator-5694c8668f-4mwf4\" (UID: \"f0643721-5a54-4c37-b857-474ed61ef531\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4mwf4" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.684310 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/92d294e9-091b-420f-aae8-da3bcab119e4-service-ca\") pod \"console-f9d7485db-g5xff\" (UID: \"92d294e9-091b-420f-aae8-da3bcab119e4\") " pod="openshift-console/console-f9d7485db-g5xff" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.684446 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a8e73ee2-9e80-4123-bece-02008233348f-etcd-client\") pod \"etcd-operator-b45778765-xgbf2\" (UID: \"a8e73ee2-9e80-4123-bece-02008233348f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xgbf2" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.684586 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6981202b-1a1f-4b0c-b68e-6a45d9914fc8-service-ca-bundle\") pod \"authentication-operator-69f744f599-nhtzb\" (UID: \"6981202b-1a1f-4b0c-b68e-6a45d9914fc8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nhtzb" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.684695 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a14dd71a-cfac-4f4e-9cbf-89599011d970-config\") pod \"apiserver-76f77b778f-87xcv\" (UID: \"a14dd71a-cfac-4f4e-9cbf-89599011d970\") " pod="openshift-apiserver/apiserver-76f77b778f-87xcv" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.684805 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/792fd1c0-de50-429b-89ff-6a3f64541e29-encryption-config\") pod \"apiserver-7bbb656c7d-g2t27\" (UID: \"792fd1c0-de50-429b-89ff-6a3f64541e29\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g2t27" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.685014 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7rsp\" (UniqueName: \"kubernetes.io/projected/b40b3484-891d-4669-838b-90c1b8d6869e-kube-api-access-l7rsp\") pod \"downloads-7954f5f757-dsqjf\" (UID: \"b40b3484-891d-4669-838b-90c1b8d6869e\") " pod="openshift-console/downloads-7954f5f757-dsqjf" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.685113 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7r77\" (UniqueName: \"kubernetes.io/projected/f3bbf6e9-0c6a-4d4a-9d35-25f941df2308-kube-api-access-g7r77\") pod \"console-operator-58897d9998-2mld6\" (UID: \"f3bbf6e9-0c6a-4d4a-9d35-25f941df2308\") " 
pod="openshift-console-operator/console-operator-58897d9998-2mld6" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.685275 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a8e73ee2-9e80-4123-bece-02008233348f-etcd-service-ca\") pod \"etcd-operator-b45778765-xgbf2\" (UID: \"a8e73ee2-9e80-4123-bece-02008233348f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xgbf2" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.685362 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-jvnk7\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") " pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.685465 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-jvnk7\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") " pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.685562 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2b8t\" (UniqueName: \"kubernetes.io/projected/792fd1c0-de50-429b-89ff-6a3f64541e29-kube-api-access-h2b8t\") pod \"apiserver-7bbb656c7d-g2t27\" (UID: \"792fd1c0-de50-429b-89ff-6a3f64541e29\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g2t27" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.685677 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a8e73ee2-9e80-4123-bece-02008233348f-etcd-ca\") pod \"etcd-operator-b45778765-xgbf2\" (UID: \"a8e73ee2-9e80-4123-bece-02008233348f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xgbf2" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.685766 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-lftlx"] Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.685804 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-9xsmt"] Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.685815 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-kkmzn"] Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.685818 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a14dd71a-cfac-4f4e-9cbf-89599011d970-trusted-ca-bundle\") pod \"apiserver-76f77b778f-87xcv\" (UID: \"a14dd71a-cfac-4f4e-9cbf-89599011d970\") " pod="openshift-apiserver/apiserver-76f77b778f-87xcv" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.687209 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a14dd71a-cfac-4f4e-9cbf-89599011d970-audit-dir\") pod \"apiserver-76f77b778f-87xcv\" (UID: \"a14dd71a-cfac-4f4e-9cbf-89599011d970\") " pod="openshift-apiserver/apiserver-76f77b778f-87xcv" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.687278 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/defd5be8-af50-4f87-a9a5-c166a9e3ce44-serving-cert\") pod 
\"openshift-apiserver-operator-796bbdcf4f-jqwls\" (UID: \"defd5be8-af50-4f87-a9a5-c166a9e3ce44\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jqwls" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.687313 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6981202b-1a1f-4b0c-b68e-6a45d9914fc8-config\") pod \"authentication-operator-69f744f599-nhtzb\" (UID: \"6981202b-1a1f-4b0c-b68e-6a45d9914fc8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nhtzb" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.687367 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a14dd71a-cfac-4f4e-9cbf-89599011d970-node-pullsecrets\") pod \"apiserver-76f77b778f-87xcv\" (UID: \"a14dd71a-cfac-4f4e-9cbf-89599011d970\") " pod="openshift-apiserver/apiserver-76f77b778f-87xcv" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.687420 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/58844231-adcc-497d-83a3-bba779038cc2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-fpbc2\" (UID: \"58844231-adcc-497d-83a3-bba779038cc2\") " pod="openshift-marketplace/marketplace-operator-79b997595-fpbc2" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.687454 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/792fd1c0-de50-429b-89ff-6a3f64541e29-etcd-client\") pod \"apiserver-7bbb656c7d-g2t27\" (UID: \"792fd1c0-de50-429b-89ff-6a3f64541e29\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g2t27" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.687491 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp44v\" (UniqueName: \"kubernetes.io/projected/a14dd71a-cfac-4f4e-9cbf-89599011d970-kube-api-access-zp44v\") pod \"apiserver-76f77b778f-87xcv\" (UID: \"a14dd71a-cfac-4f4e-9cbf-89599011d970\") " pod="openshift-apiserver/apiserver-76f77b778f-87xcv" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.687519 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-jvnk7\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") " pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.687561 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khfnz\" (UniqueName: \"kubernetes.io/projected/f0643721-5a54-4c37-b857-474ed61ef531-kube-api-access-khfnz\") pod \"machine-api-operator-5694c8668f-4mwf4\" (UID: \"f0643721-5a54-4c37-b857-474ed61ef531\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4mwf4" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.687591 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a14dd71a-cfac-4f4e-9cbf-89599011d970-serving-cert\") pod \"apiserver-76f77b778f-87xcv\" (UID: \"a14dd71a-cfac-4f4e-9cbf-89599011d970\") " pod="openshift-apiserver/apiserver-76f77b778f-87xcv" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.687619 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6cb942f6-2bc5-4662-929c-849ec29baddc-serviceca\") pod \"image-pruner-29530080-8vgj6\" (UID: 
\"6cb942f6-2bc5-4662-929c-849ec29baddc\") " pod="openshift-image-registry/image-pruner-29530080-8vgj6" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.687651 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mp6c9\" (UniqueName: \"kubernetes.io/projected/06270ded-7d6b-4966-9f10-a432c593bdfe-kube-api-access-mp6c9\") pod \"machine-approver-56656f9798-htb9k\" (UID: \"06270ded-7d6b-4966-9f10-a432c593bdfe\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-htb9k" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.687674 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/92d294e9-091b-420f-aae8-da3bcab119e4-console-oauth-config\") pod \"console-f9d7485db-g5xff\" (UID: \"92d294e9-091b-420f-aae8-da3bcab119e4\") " pod="openshift-console/console-f9d7485db-g5xff" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.687705 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/299ad63a-4584-4079-81c4-b8645326d3d0-serving-cert\") pod \"controller-manager-879f6c89f-tjkjf\" (UID: \"299ad63a-4584-4079-81c4-b8645326d3d0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tjkjf" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.687732 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drd98\" (UniqueName: \"kubernetes.io/projected/0b661fda-d14e-4491-896c-4d6812a638b5-kube-api-access-drd98\") pod \"control-plane-machine-set-operator-78cbb6b69f-xmz5p\" (UID: \"0b661fda-d14e-4491-896c-4d6812a638b5\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xmz5p" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.687766 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25de1778-6e9d-4158-9aa4-02e1607d7f45-client-ca\") pod \"route-controller-manager-6576b87f9c-c6xqc\" (UID: \"25de1778-6e9d-4158-9aa4-02e1607d7f45\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6xqc" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.687799 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0643721-5a54-4c37-b857-474ed61ef531-config\") pod \"machine-api-operator-5694c8668f-4mwf4\" (UID: \"f0643721-5a54-4c37-b857-474ed61ef531\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4mwf4" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.687831 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-jvnk7\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") " pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.687882 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/92d294e9-091b-420f-aae8-da3bcab119e4-oauth-serving-cert\") pod \"console-f9d7485db-g5xff\" (UID: \"92d294e9-091b-420f-aae8-da3bcab119e4\") " pod="openshift-console/console-f9d7485db-g5xff" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.687942 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6af08a92-85a9-4200-a0d8-abff73e0e93b-serving-cert\") pod \"openshift-config-operator-7777fb866f-qnx85\" (UID: \"6af08a92-85a9-4200-a0d8-abff73e0e93b\") 
" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qnx85" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.687977 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cf538d4f-1acd-4e61-9827-7430d1099138-audit-dir\") pod \"oauth-openshift-558db77b4-jvnk7\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") " pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.688022 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-jvnk7\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") " pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.688049 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/06270ded-7d6b-4966-9f10-a432c593bdfe-auth-proxy-config\") pod \"machine-approver-56656f9798-htb9k\" (UID: \"06270ded-7d6b-4966-9f10-a432c593bdfe\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-htb9k" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.688077 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06270ded-7d6b-4966-9f10-a432c593bdfe-config\") pod \"machine-approver-56656f9798-htb9k\" (UID: \"06270ded-7d6b-4966-9f10-a432c593bdfe\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-htb9k" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.688107 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-hvdzx\" (UniqueName: \"kubernetes.io/projected/58844231-adcc-497d-83a3-bba779038cc2-kube-api-access-hvdzx\") pod \"marketplace-operator-79b997595-fpbc2\" (UID: \"58844231-adcc-497d-83a3-bba779038cc2\") " pod="openshift-marketplace/marketplace-operator-79b997595-fpbc2" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.688142 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/792fd1c0-de50-429b-89ff-6a3f64541e29-audit-dir\") pod \"apiserver-7bbb656c7d-g2t27\" (UID: \"792fd1c0-de50-429b-89ff-6a3f64541e29\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g2t27" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.688183 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/defd5be8-af50-4f87-a9a5-c166a9e3ce44-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jqwls\" (UID: \"defd5be8-af50-4f87-a9a5-c166a9e3ce44\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jqwls" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.688205 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-jvnk7\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") " pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.688228 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3bbf6e9-0c6a-4d4a-9d35-25f941df2308-serving-cert\") pod \"console-operator-58897d9998-2mld6\" (UID: \"f3bbf6e9-0c6a-4d4a-9d35-25f941df2308\") " 
pod="openshift-console-operator/console-operator-58897d9998-2mld6" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.688249 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8e73ee2-9e80-4123-bece-02008233348f-config\") pod \"etcd-operator-b45778765-xgbf2\" (UID: \"a8e73ee2-9e80-4123-bece-02008233348f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xgbf2" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.688270 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-jvnk7\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") " pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.688302 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/92d294e9-091b-420f-aae8-da3bcab119e4-console-config\") pod \"console-f9d7485db-g5xff\" (UID: \"92d294e9-091b-420f-aae8-da3bcab119e4\") " pod="openshift-console/console-f9d7485db-g5xff" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.688318 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/6af08a92-85a9-4200-a0d8-abff73e0e93b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-qnx85\" (UID: \"6af08a92-85a9-4200-a0d8-abff73e0e93b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qnx85" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.688365 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" 
(UniqueName: \"kubernetes.io/configmap/792fd1c0-de50-429b-89ff-6a3f64541e29-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-g2t27\" (UID: \"792fd1c0-de50-429b-89ff-6a3f64541e29\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g2t27"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.688386 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w44v\" (UniqueName: \"kubernetes.io/projected/6af08a92-85a9-4200-a0d8-abff73e0e93b-kube-api-access-8w44v\") pod \"openshift-config-operator-7777fb866f-qnx85\" (UID: \"6af08a92-85a9-4200-a0d8-abff73e0e93b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qnx85"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.688432 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5732e921-c103-4eda-9705-afec618471c9-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-wtlds\" (UID: \"5732e921-c103-4eda-9705-afec618471c9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wtlds"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.688458 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj2xf\" (UniqueName: \"kubernetes.io/projected/6cb942f6-2bc5-4662-929c-849ec29baddc-kube-api-access-pj2xf\") pod \"image-pruner-29530080-8vgj6\" (UID: \"6cb942f6-2bc5-4662-929c-849ec29baddc\") " pod="openshift-image-registry/image-pruner-29530080-8vgj6"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.688504 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6981202b-1a1f-4b0c-b68e-6a45d9914fc8-serving-cert\") pod \"authentication-operator-69f744f599-nhtzb\" (UID: \"6981202b-1a1f-4b0c-b68e-6a45d9914fc8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nhtzb"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.688538 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhprn\" (UniqueName: \"kubernetes.io/projected/8bab8f58-226d-43fd-8a33-668c4e060bfb-kube-api-access-xhprn\") pod \"openshift-controller-manager-operator-756b6f6bc6-qzhgh\" (UID: \"8bab8f58-226d-43fd-8a33-668c4e060bfb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qzhgh"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.688562 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/299ad63a-4584-4079-81c4-b8645326d3d0-config\") pod \"controller-manager-879f6c89f-tjkjf\" (UID: \"299ad63a-4584-4079-81c4-b8645326d3d0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tjkjf"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.688582 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a14dd71a-cfac-4f4e-9cbf-89599011d970-etcd-client\") pod \"apiserver-76f77b778f-87xcv\" (UID: \"a14dd71a-cfac-4f4e-9cbf-89599011d970\") " pod="openshift-apiserver/apiserver-76f77b778f-87xcv"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.688625 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-jvnk7\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") " pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.688671 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e0c2b77e-cacd-4257-b3c1-699bd3f4261f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-lnz4f\" (UID: \"e0c2b77e-cacd-4257-b3c1-699bd3f4261f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lnz4f"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.688707 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktb5p\" (UniqueName: \"kubernetes.io/projected/299ad63a-4584-4079-81c4-b8645326d3d0-kube-api-access-ktb5p\") pod \"controller-manager-879f6c89f-tjkjf\" (UID: \"299ad63a-4584-4079-81c4-b8645326d3d0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tjkjf"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.689202 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f3bbf6e9-0c6a-4d4a-9d35-25f941df2308-trusted-ca\") pod \"console-operator-58897d9998-2mld6\" (UID: \"f3bbf6e9-0c6a-4d4a-9d35-25f941df2308\") " pod="openshift-console-operator/console-operator-58897d9998-2mld6"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.689266 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0643721-5a54-4c37-b857-474ed61ef531-config\") pod \"machine-api-operator-5694c8668f-4mwf4\" (UID: \"f0643721-5a54-4c37-b857-474ed61ef531\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4mwf4"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.689286 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/58844231-adcc-497d-83a3-bba779038cc2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-fpbc2\" (UID: \"58844231-adcc-497d-83a3-bba779038cc2\") " pod="openshift-marketplace/marketplace-operator-79b997595-fpbc2"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.689390 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5732e921-c103-4eda-9705-afec618471c9-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-wtlds\" (UID: \"5732e921-c103-4eda-9705-afec618471c9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wtlds"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.689434 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdl7j\" (UniqueName: \"kubernetes.io/projected/5732e921-c103-4eda-9705-afec618471c9-kube-api-access-qdl7j\") pod \"kube-storage-version-migrator-operator-b67b599dd-wtlds\" (UID: \"5732e921-c103-4eda-9705-afec618471c9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wtlds"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.689467 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/299ad63a-4584-4079-81c4-b8645326d3d0-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-tjkjf\" (UID: \"299ad63a-4584-4079-81c4-b8645326d3d0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tjkjf"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.689505 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3bbf6e9-0c6a-4d4a-9d35-25f941df2308-config\") pod \"console-operator-58897d9998-2mld6\" (UID: \"f3bbf6e9-0c6a-4d4a-9d35-25f941df2308\") " pod="openshift-console-operator/console-operator-58897d9998-2mld6"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.689537 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w99n\" (UniqueName: \"kubernetes.io/projected/6981202b-1a1f-4b0c-b68e-6a45d9914fc8-kube-api-access-5w99n\") pod \"authentication-operator-69f744f599-nhtzb\" (UID: \"6981202b-1a1f-4b0c-b68e-6a45d9914fc8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nhtzb"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.689570 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a14dd71a-cfac-4f4e-9cbf-89599011d970-encryption-config\") pod \"apiserver-76f77b778f-87xcv\" (UID: \"a14dd71a-cfac-4f4e-9cbf-89599011d970\") " pod="openshift-apiserver/apiserver-76f77b778f-87xcv"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.689617 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/06270ded-7d6b-4966-9f10-a432c593bdfe-machine-approver-tls\") pod \"machine-approver-56656f9798-htb9k\" (UID: \"06270ded-7d6b-4966-9f10-a432c593bdfe\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-htb9k"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.689647 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25de1778-6e9d-4158-9aa4-02e1607d7f45-serving-cert\") pod \"route-controller-manager-6576b87f9c-c6xqc\" (UID: \"25de1778-6e9d-4158-9aa4-02e1607d7f45\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6xqc"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.689688 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbv74\" (UniqueName: \"kubernetes.io/projected/defd5be8-af50-4f87-a9a5-c166a9e3ce44-kube-api-access-fbv74\") pod \"openshift-apiserver-operator-796bbdcf4f-jqwls\" (UID: \"defd5be8-af50-4f87-a9a5-c166a9e3ce44\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jqwls"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.689718 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92d294e9-091b-420f-aae8-da3bcab119e4-trusted-ca-bundle\") pod \"console-f9d7485db-g5xff\" (UID: \"92d294e9-091b-420f-aae8-da3bcab119e4\") " pod="openshift-console/console-f9d7485db-g5xff"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.690005 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/0b661fda-d14e-4491-896c-4d6812a638b5-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-xmz5p\" (UID: \"0b661fda-d14e-4491-896c-4d6812a638b5\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xmz5p"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.690051 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/92d294e9-091b-420f-aae8-da3bcab119e4-console-serving-cert\") pod \"console-f9d7485db-g5xff\" (UID: \"92d294e9-091b-420f-aae8-da3bcab119e4\") " pod="openshift-console/console-f9d7485db-g5xff"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.690246 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/792fd1c0-de50-429b-89ff-6a3f64541e29-serving-cert\") pod \"apiserver-7bbb656c7d-g2t27\" (UID: \"792fd1c0-de50-429b-89ff-6a3f64541e29\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g2t27"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.690285 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/792fd1c0-de50-429b-89ff-6a3f64541e29-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-g2t27\" (UID: \"792fd1c0-de50-429b-89ff-6a3f64541e29\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g2t27"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.690342 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f0643721-5a54-4c37-b857-474ed61ef531-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-4mwf4\" (UID: \"f0643721-5a54-4c37-b857-474ed61ef531\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4mwf4"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.690398 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8e73ee2-9e80-4123-bece-02008233348f-serving-cert\") pod \"etcd-operator-b45778765-xgbf2\" (UID: \"a8e73ee2-9e80-4123-bece-02008233348f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xgbf2"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.690431 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg95s\" (UniqueName: \"kubernetes.io/projected/e0c2b77e-cacd-4257-b3c1-699bd3f4261f-kube-api-access-kg95s\") pod \"cluster-image-registry-operator-dc59b4c8b-lnz4f\" (UID: \"e0c2b77e-cacd-4257-b3c1-699bd3f4261f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lnz4f"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.690470 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhfk4\" (UniqueName: \"kubernetes.io/projected/a4c1e470-e025-4285-a712-52b4fead0799-kube-api-access-jhfk4\") pod \"dns-operator-744455d44c-d2dh9\" (UID: \"a4c1e470-e025-4285-a712-52b4fead0799\") " pod="openshift-dns-operator/dns-operator-744455d44c-d2dh9"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.690499 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-jvnk7\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") " pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.690529 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-jvnk7\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") " pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.692147 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-qnx85"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.692376 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8bab8f58-226d-43fd-8a33-668c4e060bfb-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-qzhgh\" (UID: \"8bab8f58-226d-43fd-8a33-668c4e060bfb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qzhgh"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.692487 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6981202b-1a1f-4b0c-b68e-6a45d9914fc8-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-nhtzb\" (UID: \"6981202b-1a1f-4b0c-b68e-6a45d9914fc8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nhtzb"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.692631 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a14dd71a-cfac-4f4e-9cbf-89599011d970-image-import-ca\") pod \"apiserver-76f77b778f-87xcv\" (UID: \"a14dd71a-cfac-4f4e-9cbf-89599011d970\") " pod="openshift-apiserver/apiserver-76f77b778f-87xcv"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.692659 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a4c1e470-e025-4285-a712-52b4fead0799-metrics-tls\") pod \"dns-operator-744455d44c-d2dh9\" (UID: \"a4c1e470-e025-4285-a712-52b4fead0799\") " pod="openshift-dns-operator/dns-operator-744455d44c-d2dh9"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.692682 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cf538d4f-1acd-4e61-9827-7430d1099138-audit-policies\") pod \"oauth-openshift-558db77b4-jvnk7\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") " pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.692721 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8l4rw\" (UniqueName: \"kubernetes.io/projected/cf538d4f-1acd-4e61-9827-7430d1099138-kube-api-access-8l4rw\") pod \"oauth-openshift-558db77b4-jvnk7\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") " pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.692806 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a14dd71a-cfac-4f4e-9cbf-89599011d970-audit\") pod \"apiserver-76f77b778f-87xcv\" (UID: \"a14dd71a-cfac-4f4e-9cbf-89599011d970\") " pod="openshift-apiserver/apiserver-76f77b778f-87xcv"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.692938 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9lqk\" (UniqueName: \"kubernetes.io/projected/25de1778-6e9d-4158-9aa4-02e1607d7f45-kube-api-access-z9lqk\") pod \"route-controller-manager-6576b87f9c-c6xqc\" (UID: \"25de1778-6e9d-4158-9aa4-02e1607d7f45\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6xqc"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.692959 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/299ad63a-4584-4079-81c4-b8645326d3d0-client-ca\") pod \"controller-manager-879f6c89f-tjkjf\" (UID: \"299ad63a-4584-4079-81c4-b8645326d3d0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tjkjf"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.692977 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a14dd71a-cfac-4f4e-9cbf-89599011d970-etcd-serving-ca\") pod \"apiserver-76f77b778f-87xcv\" (UID: \"a14dd71a-cfac-4f4e-9cbf-89599011d970\") " pod="openshift-apiserver/apiserver-76f77b778f-87xcv"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.693098 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-jvnk7\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") " pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.693119 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xs8l\" (UniqueName: \"kubernetes.io/projected/92d294e9-091b-420f-aae8-da3bcab119e4-kube-api-access-9xs8l\") pod \"console-f9d7485db-g5xff\" (UID: \"92d294e9-091b-420f-aae8-da3bcab119e4\") " pod="openshift-console/console-f9d7485db-g5xff"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.693150 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjhfg\" (UniqueName: \"kubernetes.io/projected/a8e73ee2-9e80-4123-bece-02008233348f-kube-api-access-vjhfg\") pod \"etcd-operator-b45778765-xgbf2\" (UID: \"a8e73ee2-9e80-4123-bece-02008233348f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xgbf2"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.693168 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e0c2b77e-cacd-4257-b3c1-699bd3f4261f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-lnz4f\" (UID: \"e0c2b77e-cacd-4257-b3c1-699bd3f4261f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lnz4f"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.693185 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bab8f58-226d-43fd-8a33-668c4e060bfb-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-qzhgh\" (UID: \"8bab8f58-226d-43fd-8a33-668c4e060bfb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qzhgh"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.693207 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/792fd1c0-de50-429b-89ff-6a3f64541e29-audit-policies\") pod \"apiserver-7bbb656c7d-g2t27\" (UID: \"792fd1c0-de50-429b-89ff-6a3f64541e29\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g2t27"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.693704 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-d2dh9"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.694119 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.695531 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xmz5p"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.696467 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jvnk7"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.698361 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-g2t27"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.701894 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f0643721-5a54-4c37-b857-474ed61ef531-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-4mwf4\" (UID: \"f0643721-5a54-4c37-b857-474ed61ef531\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4mwf4"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.704985 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-chd9d"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.705096 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-87xcv"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.705136 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-h9hzt"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.705153 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k8wg7"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.707950 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-kkhmq"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.708087 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wgw78"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.711173 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-42hhs"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.712881 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-5w4w6"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.713073 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-42hhs"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.715890 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-dsqjf"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.716075 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5w4w6"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.743381 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-z22gj"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.743449 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lnz4f"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.743792 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.745247 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-z7rvn"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.746831 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6ggwm"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.746924 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.747402 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-knx8j"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.749290 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvfx4"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.749367 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x959d"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.750260 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-42hhs"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.751201 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5w4w6"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.753505 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6lsk2"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.753788 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.754556 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l5546"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.755480 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530080-85k47"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.756414 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-vh89z"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.757514 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-vh89z"]
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.757614 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-vh89z"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.771220 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.791401 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.794148 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/792fd1c0-de50-429b-89ff-6a3f64541e29-serving-cert\") pod \"apiserver-7bbb656c7d-g2t27\" (UID: \"792fd1c0-de50-429b-89ff-6a3f64541e29\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g2t27"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.794178 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/792fd1c0-de50-429b-89ff-6a3f64541e29-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-g2t27\" (UID: \"792fd1c0-de50-429b-89ff-6a3f64541e29\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g2t27"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.794199 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8e73ee2-9e80-4123-bece-02008233348f-serving-cert\") pod \"etcd-operator-b45778765-xgbf2\" (UID: \"a8e73ee2-9e80-4123-bece-02008233348f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xgbf2"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.794232 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kg95s\" (UniqueName: \"kubernetes.io/projected/e0c2b77e-cacd-4257-b3c1-699bd3f4261f-kube-api-access-kg95s\") pod \"cluster-image-registry-operator-dc59b4c8b-lnz4f\" (UID: \"e0c2b77e-cacd-4257-b3c1-699bd3f4261f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lnz4f"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.794259 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhfk4\" (UniqueName: \"kubernetes.io/projected/a4c1e470-e025-4285-a712-52b4fead0799-kube-api-access-jhfk4\") pod \"dns-operator-744455d44c-d2dh9\" (UID: \"a4c1e470-e025-4285-a712-52b4fead0799\") " pod="openshift-dns-operator/dns-operator-744455d44c-d2dh9"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.794286 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-jvnk7\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") " pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.794306 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-jvnk7\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") " pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.794337 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8bab8f58-226d-43fd-8a33-668c4e060bfb-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-qzhgh\" (UID: \"8bab8f58-226d-43fd-8a33-668c4e060bfb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qzhgh"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.794352 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6981202b-1a1f-4b0c-b68e-6a45d9914fc8-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-nhtzb\" (UID: \"6981202b-1a1f-4b0c-b68e-6a45d9914fc8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nhtzb"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.794366 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a14dd71a-cfac-4f4e-9cbf-89599011d970-image-import-ca\") pod \"apiserver-76f77b778f-87xcv\" (UID: \"a14dd71a-cfac-4f4e-9cbf-89599011d970\") " pod="openshift-apiserver/apiserver-76f77b778f-87xcv"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.794379 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a4c1e470-e025-4285-a712-52b4fead0799-metrics-tls\") pod \"dns-operator-744455d44c-d2dh9\" (UID: \"a4c1e470-e025-4285-a712-52b4fead0799\") " pod="openshift-dns-operator/dns-operator-744455d44c-d2dh9"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.794394 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cf538d4f-1acd-4e61-9827-7430d1099138-audit-policies\") pod \"oauth-openshift-558db77b4-jvnk7\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") " pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.794409 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8l4rw\" (UniqueName: \"kubernetes.io/projected/cf538d4f-1acd-4e61-9827-7430d1099138-kube-api-access-8l4rw\") pod \"oauth-openshift-558db77b4-jvnk7\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") " pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.794428 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a14dd71a-cfac-4f4e-9cbf-89599011d970-audit\") pod \"apiserver-76f77b778f-87xcv\" (UID: \"a14dd71a-cfac-4f4e-9cbf-89599011d970\") " pod="openshift-apiserver/apiserver-76f77b778f-87xcv"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.794444 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9lqk\" (UniqueName: \"kubernetes.io/projected/25de1778-6e9d-4158-9aa4-02e1607d7f45-kube-api-access-z9lqk\") pod \"route-controller-manager-6576b87f9c-c6xqc\" (UID: \"25de1778-6e9d-4158-9aa4-02e1607d7f45\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6xqc"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.794461 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/299ad63a-4584-4079-81c4-b8645326d3d0-client-ca\") pod \"controller-manager-879f6c89f-tjkjf\" (UID: \"299ad63a-4584-4079-81c4-b8645326d3d0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tjkjf"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.794477 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a14dd71a-cfac-4f4e-9cbf-89599011d970-etcd-serving-ca\") pod \"apiserver-76f77b778f-87xcv\" (UID: \"a14dd71a-cfac-4f4e-9cbf-89599011d970\") " pod="openshift-apiserver/apiserver-76f77b778f-87xcv"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.794492 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-jvnk7\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") " pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.794510 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xs8l\" (UniqueName: \"kubernetes.io/projected/92d294e9-091b-420f-aae8-da3bcab119e4-kube-api-access-9xs8l\") pod \"console-f9d7485db-g5xff\" (UID: \"92d294e9-091b-420f-aae8-da3bcab119e4\") " pod="openshift-console/console-f9d7485db-g5xff"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.794537 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjhfg\" (UniqueName: \"kubernetes.io/projected/a8e73ee2-9e80-4123-bece-02008233348f-kube-api-access-vjhfg\") pod \"etcd-operator-b45778765-xgbf2\" (UID: \"a8e73ee2-9e80-4123-bece-02008233348f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xgbf2"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.794556 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e0c2b77e-cacd-4257-b3c1-699bd3f4261f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-lnz4f\" (UID: \"e0c2b77e-cacd-4257-b3c1-699bd3f4261f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lnz4f"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.794573 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bab8f58-226d-43fd-8a33-668c4e060bfb-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-qzhgh\" (UID: \"8bab8f58-226d-43fd-8a33-668c4e060bfb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qzhgh"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.794593 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/792fd1c0-de50-429b-89ff-6a3f64541e29-audit-policies\") pod \"apiserver-7bbb656c7d-g2t27\" (UID: \"792fd1c0-de50-429b-89ff-6a3f64541e29\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g2t27"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.794610 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25de1778-6e9d-4158-9aa4-02e1607d7f45-config\") pod \"route-controller-manager-6576b87f9c-c6xqc\" (UID: \"25de1778-6e9d-4158-9aa4-02e1607d7f45\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6xqc"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.794627 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e0c2b77e-cacd-4257-b3c1-699bd3f4261f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-lnz4f\" (UID: \"e0c2b77e-cacd-4257-b3c1-699bd3f4261f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lnz4f"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.794644 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/92d294e9-091b-420f-aae8-da3bcab119e4-service-ca\") pod \"console-f9d7485db-g5xff\" (UID: \"92d294e9-091b-420f-aae8-da3bcab119e4\") " pod="openshift-console/console-f9d7485db-g5xff"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.794659 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a8e73ee2-9e80-4123-bece-02008233348f-etcd-client\") pod \"etcd-operator-b45778765-xgbf2\" (UID: \"a8e73ee2-9e80-4123-bece-02008233348f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xgbf2"
Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.794673 4735 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6981202b-1a1f-4b0c-b68e-6a45d9914fc8-service-ca-bundle\") pod \"authentication-operator-69f744f599-nhtzb\" (UID: \"6981202b-1a1f-4b0c-b68e-6a45d9914fc8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nhtzb" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.794688 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a14dd71a-cfac-4f4e-9cbf-89599011d970-config\") pod \"apiserver-76f77b778f-87xcv\" (UID: \"a14dd71a-cfac-4f4e-9cbf-89599011d970\") " pod="openshift-apiserver/apiserver-76f77b778f-87xcv" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.794703 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/792fd1c0-de50-429b-89ff-6a3f64541e29-encryption-config\") pod \"apiserver-7bbb656c7d-g2t27\" (UID: \"792fd1c0-de50-429b-89ff-6a3f64541e29\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g2t27" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.794719 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7rsp\" (UniqueName: \"kubernetes.io/projected/b40b3484-891d-4669-838b-90c1b8d6869e-kube-api-access-l7rsp\") pod \"downloads-7954f5f757-dsqjf\" (UID: \"b40b3484-891d-4669-838b-90c1b8d6869e\") " pod="openshift-console/downloads-7954f5f757-dsqjf" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.794735 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7r77\" (UniqueName: \"kubernetes.io/projected/f3bbf6e9-0c6a-4d4a-9d35-25f941df2308-kube-api-access-g7r77\") pod \"console-operator-58897d9998-2mld6\" (UID: \"f3bbf6e9-0c6a-4d4a-9d35-25f941df2308\") " pod="openshift-console-operator/console-operator-58897d9998-2mld6" Feb 23 00:09:40 crc 
kubenswrapper[4735]: I0223 00:09:40.794750 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a8e73ee2-9e80-4123-bece-02008233348f-etcd-service-ca\") pod \"etcd-operator-b45778765-xgbf2\" (UID: \"a8e73ee2-9e80-4123-bece-02008233348f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xgbf2" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.794766 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-jvnk7\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") " pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.794783 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-jvnk7\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") " pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.794799 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2b8t\" (UniqueName: \"kubernetes.io/projected/792fd1c0-de50-429b-89ff-6a3f64541e29-kube-api-access-h2b8t\") pod \"apiserver-7bbb656c7d-g2t27\" (UID: \"792fd1c0-de50-429b-89ff-6a3f64541e29\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g2t27" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.794814 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a8e73ee2-9e80-4123-bece-02008233348f-etcd-ca\") pod \"etcd-operator-b45778765-xgbf2\" (UID: 
\"a8e73ee2-9e80-4123-bece-02008233348f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xgbf2" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.794835 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a14dd71a-cfac-4f4e-9cbf-89599011d970-trusted-ca-bundle\") pod \"apiserver-76f77b778f-87xcv\" (UID: \"a14dd71a-cfac-4f4e-9cbf-89599011d970\") " pod="openshift-apiserver/apiserver-76f77b778f-87xcv" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.794892 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a14dd71a-cfac-4f4e-9cbf-89599011d970-audit-dir\") pod \"apiserver-76f77b778f-87xcv\" (UID: \"a14dd71a-cfac-4f4e-9cbf-89599011d970\") " pod="openshift-apiserver/apiserver-76f77b778f-87xcv" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.794920 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6981202b-1a1f-4b0c-b68e-6a45d9914fc8-config\") pod \"authentication-operator-69f744f599-nhtzb\" (UID: \"6981202b-1a1f-4b0c-b68e-6a45d9914fc8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nhtzb" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.794945 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a14dd71a-cfac-4f4e-9cbf-89599011d970-node-pullsecrets\") pod \"apiserver-76f77b778f-87xcv\" (UID: \"a14dd71a-cfac-4f4e-9cbf-89599011d970\") " pod="openshift-apiserver/apiserver-76f77b778f-87xcv" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.794960 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/58844231-adcc-497d-83a3-bba779038cc2-marketplace-trusted-ca\") pod 
\"marketplace-operator-79b997595-fpbc2\" (UID: \"58844231-adcc-497d-83a3-bba779038cc2\") " pod="openshift-marketplace/marketplace-operator-79b997595-fpbc2" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.794976 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/792fd1c0-de50-429b-89ff-6a3f64541e29-etcd-client\") pod \"apiserver-7bbb656c7d-g2t27\" (UID: \"792fd1c0-de50-429b-89ff-6a3f64541e29\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g2t27" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.794994 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp44v\" (UniqueName: \"kubernetes.io/projected/a14dd71a-cfac-4f4e-9cbf-89599011d970-kube-api-access-zp44v\") pod \"apiserver-76f77b778f-87xcv\" (UID: \"a14dd71a-cfac-4f4e-9cbf-89599011d970\") " pod="openshift-apiserver/apiserver-76f77b778f-87xcv" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.795009 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-jvnk7\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") " pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.795030 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a14dd71a-cfac-4f4e-9cbf-89599011d970-serving-cert\") pod \"apiserver-76f77b778f-87xcv\" (UID: \"a14dd71a-cfac-4f4e-9cbf-89599011d970\") " pod="openshift-apiserver/apiserver-76f77b778f-87xcv" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.795044 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/6cb942f6-2bc5-4662-929c-849ec29baddc-serviceca\") pod \"image-pruner-29530080-8vgj6\" (UID: \"6cb942f6-2bc5-4662-929c-849ec29baddc\") " pod="openshift-image-registry/image-pruner-29530080-8vgj6" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.795059 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mp6c9\" (UniqueName: \"kubernetes.io/projected/06270ded-7d6b-4966-9f10-a432c593bdfe-kube-api-access-mp6c9\") pod \"machine-approver-56656f9798-htb9k\" (UID: \"06270ded-7d6b-4966-9f10-a432c593bdfe\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-htb9k" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.795075 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/92d294e9-091b-420f-aae8-da3bcab119e4-console-oauth-config\") pod \"console-f9d7485db-g5xff\" (UID: \"92d294e9-091b-420f-aae8-da3bcab119e4\") " pod="openshift-console/console-f9d7485db-g5xff" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.795090 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/299ad63a-4584-4079-81c4-b8645326d3d0-serving-cert\") pod \"controller-manager-879f6c89f-tjkjf\" (UID: \"299ad63a-4584-4079-81c4-b8645326d3d0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tjkjf" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.795107 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drd98\" (UniqueName: \"kubernetes.io/projected/0b661fda-d14e-4491-896c-4d6812a638b5-kube-api-access-drd98\") pod \"control-plane-machine-set-operator-78cbb6b69f-xmz5p\" (UID: \"0b661fda-d14e-4491-896c-4d6812a638b5\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xmz5p" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.795257 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25de1778-6e9d-4158-9aa4-02e1607d7f45-client-ca\") pod \"route-controller-manager-6576b87f9c-c6xqc\" (UID: \"25de1778-6e9d-4158-9aa4-02e1607d7f45\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6xqc" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.795277 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-jvnk7\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") " pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.795300 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/92d294e9-091b-420f-aae8-da3bcab119e4-oauth-serving-cert\") pod \"console-f9d7485db-g5xff\" (UID: \"92d294e9-091b-420f-aae8-da3bcab119e4\") " pod="openshift-console/console-f9d7485db-g5xff" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.795322 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6af08a92-85a9-4200-a0d8-abff73e0e93b-serving-cert\") pod \"openshift-config-operator-7777fb866f-qnx85\" (UID: \"6af08a92-85a9-4200-a0d8-abff73e0e93b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qnx85" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.795339 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cf538d4f-1acd-4e61-9827-7430d1099138-audit-dir\") pod \"oauth-openshift-558db77b4-jvnk7\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.795355 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-jvnk7\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") " pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.795375 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/06270ded-7d6b-4966-9f10-a432c593bdfe-auth-proxy-config\") pod \"machine-approver-56656f9798-htb9k\" (UID: \"06270ded-7d6b-4966-9f10-a432c593bdfe\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-htb9k" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.795394 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06270ded-7d6b-4966-9f10-a432c593bdfe-config\") pod \"machine-approver-56656f9798-htb9k\" (UID: \"06270ded-7d6b-4966-9f10-a432c593bdfe\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-htb9k" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.795410 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvdzx\" (UniqueName: \"kubernetes.io/projected/58844231-adcc-497d-83a3-bba779038cc2-kube-api-access-hvdzx\") pod \"marketplace-operator-79b997595-fpbc2\" (UID: \"58844231-adcc-497d-83a3-bba779038cc2\") " pod="openshift-marketplace/marketplace-operator-79b997595-fpbc2" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.795424 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/792fd1c0-de50-429b-89ff-6a3f64541e29-audit-dir\") pod \"apiserver-7bbb656c7d-g2t27\" (UID: \"792fd1c0-de50-429b-89ff-6a3f64541e29\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g2t27" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.795444 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-jvnk7\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") " pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.795458 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3bbf6e9-0c6a-4d4a-9d35-25f941df2308-serving-cert\") pod \"console-operator-58897d9998-2mld6\" (UID: \"f3bbf6e9-0c6a-4d4a-9d35-25f941df2308\") " pod="openshift-console-operator/console-operator-58897d9998-2mld6" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.795472 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8e73ee2-9e80-4123-bece-02008233348f-config\") pod \"etcd-operator-b45778765-xgbf2\" (UID: \"a8e73ee2-9e80-4123-bece-02008233348f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xgbf2" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.795486 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-jvnk7\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") " pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.795501 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/92d294e9-091b-420f-aae8-da3bcab119e4-console-config\") pod \"console-f9d7485db-g5xff\" (UID: \"92d294e9-091b-420f-aae8-da3bcab119e4\") " pod="openshift-console/console-f9d7485db-g5xff" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.795516 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/6af08a92-85a9-4200-a0d8-abff73e0e93b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-qnx85\" (UID: \"6af08a92-85a9-4200-a0d8-abff73e0e93b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qnx85" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.795539 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/792fd1c0-de50-429b-89ff-6a3f64541e29-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-g2t27\" (UID: \"792fd1c0-de50-429b-89ff-6a3f64541e29\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g2t27" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.795554 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8w44v\" (UniqueName: \"kubernetes.io/projected/6af08a92-85a9-4200-a0d8-abff73e0e93b-kube-api-access-8w44v\") pod \"openshift-config-operator-7777fb866f-qnx85\" (UID: \"6af08a92-85a9-4200-a0d8-abff73e0e93b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qnx85" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.795576 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5732e921-c103-4eda-9705-afec618471c9-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-wtlds\" (UID: \"5732e921-c103-4eda-9705-afec618471c9\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wtlds" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.795592 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj2xf\" (UniqueName: \"kubernetes.io/projected/6cb942f6-2bc5-4662-929c-849ec29baddc-kube-api-access-pj2xf\") pod \"image-pruner-29530080-8vgj6\" (UID: \"6cb942f6-2bc5-4662-929c-849ec29baddc\") " pod="openshift-image-registry/image-pruner-29530080-8vgj6" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.795607 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6981202b-1a1f-4b0c-b68e-6a45d9914fc8-serving-cert\") pod \"authentication-operator-69f744f599-nhtzb\" (UID: \"6981202b-1a1f-4b0c-b68e-6a45d9914fc8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nhtzb" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.795624 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhprn\" (UniqueName: \"kubernetes.io/projected/8bab8f58-226d-43fd-8a33-668c4e060bfb-kube-api-access-xhprn\") pod \"openshift-controller-manager-operator-756b6f6bc6-qzhgh\" (UID: \"8bab8f58-226d-43fd-8a33-668c4e060bfb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qzhgh" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.795638 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/299ad63a-4584-4079-81c4-b8645326d3d0-config\") pod \"controller-manager-879f6c89f-tjkjf\" (UID: \"299ad63a-4584-4079-81c4-b8645326d3d0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tjkjf" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.795651 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etcd-client\" (UniqueName: \"kubernetes.io/secret/a14dd71a-cfac-4f4e-9cbf-89599011d970-etcd-client\") pod \"apiserver-76f77b778f-87xcv\" (UID: \"a14dd71a-cfac-4f4e-9cbf-89599011d970\") " pod="openshift-apiserver/apiserver-76f77b778f-87xcv" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.795666 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-jvnk7\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") " pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.795683 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e0c2b77e-cacd-4257-b3c1-699bd3f4261f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-lnz4f\" (UID: \"e0c2b77e-cacd-4257-b3c1-699bd3f4261f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lnz4f" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.795700 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktb5p\" (UniqueName: \"kubernetes.io/projected/299ad63a-4584-4079-81c4-b8645326d3d0-kube-api-access-ktb5p\") pod \"controller-manager-879f6c89f-tjkjf\" (UID: \"299ad63a-4584-4079-81c4-b8645326d3d0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tjkjf" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.795717 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f3bbf6e9-0c6a-4d4a-9d35-25f941df2308-trusted-ca\") pod \"console-operator-58897d9998-2mld6\" (UID: \"f3bbf6e9-0c6a-4d4a-9d35-25f941df2308\") " pod="openshift-console-operator/console-operator-58897d9998-2mld6" Feb 23 
00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.795740 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/58844231-adcc-497d-83a3-bba779038cc2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-fpbc2\" (UID: \"58844231-adcc-497d-83a3-bba779038cc2\") " pod="openshift-marketplace/marketplace-operator-79b997595-fpbc2" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.795763 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5732e921-c103-4eda-9705-afec618471c9-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-wtlds\" (UID: \"5732e921-c103-4eda-9705-afec618471c9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wtlds" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.795780 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdl7j\" (UniqueName: \"kubernetes.io/projected/5732e921-c103-4eda-9705-afec618471c9-kube-api-access-qdl7j\") pod \"kube-storage-version-migrator-operator-b67b599dd-wtlds\" (UID: \"5732e921-c103-4eda-9705-afec618471c9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wtlds" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.795795 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/299ad63a-4584-4079-81c4-b8645326d3d0-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-tjkjf\" (UID: \"299ad63a-4584-4079-81c4-b8645326d3d0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tjkjf" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.795811 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f3bbf6e9-0c6a-4d4a-9d35-25f941df2308-config\") pod \"console-operator-58897d9998-2mld6\" (UID: \"f3bbf6e9-0c6a-4d4a-9d35-25f941df2308\") " pod="openshift-console-operator/console-operator-58897d9998-2mld6" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.795828 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5w99n\" (UniqueName: \"kubernetes.io/projected/6981202b-1a1f-4b0c-b68e-6a45d9914fc8-kube-api-access-5w99n\") pod \"authentication-operator-69f744f599-nhtzb\" (UID: \"6981202b-1a1f-4b0c-b68e-6a45d9914fc8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nhtzb" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.795846 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a14dd71a-cfac-4f4e-9cbf-89599011d970-encryption-config\") pod \"apiserver-76f77b778f-87xcv\" (UID: \"a14dd71a-cfac-4f4e-9cbf-89599011d970\") " pod="openshift-apiserver/apiserver-76f77b778f-87xcv" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.795912 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/06270ded-7d6b-4966-9f10-a432c593bdfe-machine-approver-tls\") pod \"machine-approver-56656f9798-htb9k\" (UID: \"06270ded-7d6b-4966-9f10-a432c593bdfe\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-htb9k" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.795928 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25de1778-6e9d-4158-9aa4-02e1607d7f45-serving-cert\") pod \"route-controller-manager-6576b87f9c-c6xqc\" (UID: \"25de1778-6e9d-4158-9aa4-02e1607d7f45\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6xqc" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 
00:09:40.795948 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92d294e9-091b-420f-aae8-da3bcab119e4-trusted-ca-bundle\") pod \"console-f9d7485db-g5xff\" (UID: \"92d294e9-091b-420f-aae8-da3bcab119e4\") " pod="openshift-console/console-f9d7485db-g5xff" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.795965 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/0b661fda-d14e-4491-896c-4d6812a638b5-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-xmz5p\" (UID: \"0b661fda-d14e-4491-896c-4d6812a638b5\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xmz5p" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.795980 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/92d294e9-091b-420f-aae8-da3bcab119e4-console-serving-cert\") pod \"console-f9d7485db-g5xff\" (UID: \"92d294e9-091b-420f-aae8-da3bcab119e4\") " pod="openshift-console/console-f9d7485db-g5xff" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.796731 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/792fd1c0-de50-429b-89ff-6a3f64541e29-serving-cert\") pod \"apiserver-7bbb656c7d-g2t27\" (UID: \"792fd1c0-de50-429b-89ff-6a3f64541e29\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g2t27" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.797600 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/792fd1c0-de50-429b-89ff-6a3f64541e29-audit-policies\") pod \"apiserver-7bbb656c7d-g2t27\" (UID: \"792fd1c0-de50-429b-89ff-6a3f64541e29\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g2t27" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.797834 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a14dd71a-cfac-4f4e-9cbf-89599011d970-etcd-serving-ca\") pod \"apiserver-76f77b778f-87xcv\" (UID: \"a14dd71a-cfac-4f4e-9cbf-89599011d970\") " pod="openshift-apiserver/apiserver-76f77b778f-87xcv" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.797972 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cf538d4f-1acd-4e61-9827-7430d1099138-audit-policies\") pod \"oauth-openshift-558db77b4-jvnk7\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") " pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.798249 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6981202b-1a1f-4b0c-b68e-6a45d9914fc8-service-ca-bundle\") pod \"authentication-operator-69f744f599-nhtzb\" (UID: \"6981202b-1a1f-4b0c-b68e-6a45d9914fc8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nhtzb" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.798565 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/792fd1c0-de50-429b-89ff-6a3f64541e29-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-g2t27\" (UID: \"792fd1c0-de50-429b-89ff-6a3f64541e29\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g2t27" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.798703 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/299ad63a-4584-4079-81c4-b8645326d3d0-client-ca\") pod \"controller-manager-879f6c89f-tjkjf\" (UID: 
\"299ad63a-4584-4079-81c4-b8645326d3d0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tjkjf" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.799107 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/92d294e9-091b-420f-aae8-da3bcab119e4-service-ca\") pod \"console-f9d7485db-g5xff\" (UID: \"92d294e9-091b-420f-aae8-da3bcab119e4\") " pod="openshift-console/console-f9d7485db-g5xff" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.799113 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/792fd1c0-de50-429b-89ff-6a3f64541e29-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-g2t27\" (UID: \"792fd1c0-de50-429b-89ff-6a3f64541e29\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g2t27" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.799138 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/92d294e9-091b-420f-aae8-da3bcab119e4-console-serving-cert\") pod \"console-f9d7485db-g5xff\" (UID: \"92d294e9-091b-420f-aae8-da3bcab119e4\") " pod="openshift-console/console-f9d7485db-g5xff" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.799380 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/06270ded-7d6b-4966-9f10-a432c593bdfe-auth-proxy-config\") pod \"machine-approver-56656f9798-htb9k\" (UID: \"06270ded-7d6b-4966-9f10-a432c593bdfe\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-htb9k" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.799499 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a14dd71a-cfac-4f4e-9cbf-89599011d970-serving-cert\") pod \"apiserver-76f77b778f-87xcv\" (UID: 
\"a14dd71a-cfac-4f4e-9cbf-89599011d970\") " pod="openshift-apiserver/apiserver-76f77b778f-87xcv" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.799844 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06270ded-7d6b-4966-9f10-a432c593bdfe-config\") pod \"machine-approver-56656f9798-htb9k\" (UID: \"06270ded-7d6b-4966-9f10-a432c593bdfe\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-htb9k" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.799943 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a14dd71a-cfac-4f4e-9cbf-89599011d970-config\") pod \"apiserver-76f77b778f-87xcv\" (UID: \"a14dd71a-cfac-4f4e-9cbf-89599011d970\") " pod="openshift-apiserver/apiserver-76f77b778f-87xcv" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.799991 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/792fd1c0-de50-429b-89ff-6a3f64541e29-audit-dir\") pod \"apiserver-7bbb656c7d-g2t27\" (UID: \"792fd1c0-de50-429b-89ff-6a3f64541e29\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g2t27" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.800325 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-jvnk7\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") " pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.800429 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6cb942f6-2bc5-4662-929c-849ec29baddc-serviceca\") pod \"image-pruner-29530080-8vgj6\" (UID: 
\"6cb942f6-2bc5-4662-929c-849ec29baddc\") " pod="openshift-image-registry/image-pruner-29530080-8vgj6" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.800629 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-jvnk7\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") " pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.800869 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bab8f58-226d-43fd-8a33-668c4e060bfb-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-qzhgh\" (UID: \"8bab8f58-226d-43fd-8a33-668c4e060bfb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qzhgh" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.801046 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a8e73ee2-9e80-4123-bece-02008233348f-etcd-ca\") pod \"etcd-operator-b45778765-xgbf2\" (UID: \"a8e73ee2-9e80-4123-bece-02008233348f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xgbf2" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.801468 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f3bbf6e9-0c6a-4d4a-9d35-25f941df2308-trusted-ca\") pod \"console-operator-58897d9998-2mld6\" (UID: \"f3bbf6e9-0c6a-4d4a-9d35-25f941df2308\") " pod="openshift-console-operator/console-operator-58897d9998-2mld6" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.801548 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25de1778-6e9d-4158-9aa4-02e1607d7f45-config\") pod 
\"route-controller-manager-6576b87f9c-c6xqc\" (UID: \"25de1778-6e9d-4158-9aa4-02e1607d7f45\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6xqc" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.801944 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a8e73ee2-9e80-4123-bece-02008233348f-etcd-service-ca\") pod \"etcd-operator-b45778765-xgbf2\" (UID: \"a8e73ee2-9e80-4123-bece-02008233348f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xgbf2" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.802218 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5732e921-c103-4eda-9705-afec618471c9-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-wtlds\" (UID: \"5732e921-c103-4eda-9705-afec618471c9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wtlds" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.802275 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5732e921-c103-4eda-9705-afec618471c9-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-wtlds\" (UID: \"5732e921-c103-4eda-9705-afec618471c9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wtlds" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.802314 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8e73ee2-9e80-4123-bece-02008233348f-config\") pod \"etcd-operator-b45778765-xgbf2\" (UID: \"a8e73ee2-9e80-4123-bece-02008233348f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xgbf2" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.802339 4735 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/92d294e9-091b-420f-aae8-da3bcab119e4-console-config\") pod \"console-f9d7485db-g5xff\" (UID: \"92d294e9-091b-420f-aae8-da3bcab119e4\") " pod="openshift-console/console-f9d7485db-g5xff" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.802573 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/299ad63a-4584-4079-81c4-b8645326d3d0-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-tjkjf\" (UID: \"299ad63a-4584-4079-81c4-b8645326d3d0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tjkjf" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.802756 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/6af08a92-85a9-4200-a0d8-abff73e0e93b-available-featuregates\") pod \"openshift-config-operator-7777fb866f-qnx85\" (UID: \"6af08a92-85a9-4200-a0d8-abff73e0e93b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qnx85" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.803330 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a14dd71a-cfac-4f4e-9cbf-89599011d970-audit\") pod \"apiserver-76f77b778f-87xcv\" (UID: \"a14dd71a-cfac-4f4e-9cbf-89599011d970\") " pod="openshift-apiserver/apiserver-76f77b778f-87xcv" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.803424 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-jvnk7\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") " pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.803494 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3bbf6e9-0c6a-4d4a-9d35-25f941df2308-serving-cert\") pod \"console-operator-58897d9998-2mld6\" (UID: \"f3bbf6e9-0c6a-4d4a-9d35-25f941df2308\") " pod="openshift-console-operator/console-operator-58897d9998-2mld6" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.803589 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a14dd71a-cfac-4f4e-9cbf-89599011d970-trusted-ca-bundle\") pod \"apiserver-76f77b778f-87xcv\" (UID: \"a14dd71a-cfac-4f4e-9cbf-89599011d970\") " pod="openshift-apiserver/apiserver-76f77b778f-87xcv" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.803622 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a14dd71a-cfac-4f4e-9cbf-89599011d970-audit-dir\") pod \"apiserver-76f77b778f-87xcv\" (UID: \"a14dd71a-cfac-4f4e-9cbf-89599011d970\") " pod="openshift-apiserver/apiserver-76f77b778f-87xcv" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.803658 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e0c2b77e-cacd-4257-b3c1-699bd3f4261f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-lnz4f\" (UID: \"e0c2b77e-cacd-4257-b3c1-699bd3f4261f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lnz4f" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.803712 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cf538d4f-1acd-4e61-9827-7430d1099138-audit-dir\") pod \"oauth-openshift-558db77b4-jvnk7\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") " pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.804234 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-jvnk7\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") " pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.804321 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a14dd71a-cfac-4f4e-9cbf-89599011d970-node-pullsecrets\") pod \"apiserver-76f77b778f-87xcv\" (UID: \"a14dd71a-cfac-4f4e-9cbf-89599011d970\") " pod="openshift-apiserver/apiserver-76f77b778f-87xcv" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.804464 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6981202b-1a1f-4b0c-b68e-6a45d9914fc8-config\") pod \"authentication-operator-69f744f599-nhtzb\" (UID: \"6981202b-1a1f-4b0c-b68e-6a45d9914fc8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nhtzb" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.804550 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-jvnk7\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") " pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.804258 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/299ad63a-4584-4079-81c4-b8645326d3d0-config\") pod \"controller-manager-879f6c89f-tjkjf\" (UID: \"299ad63a-4584-4079-81c4-b8645326d3d0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tjkjf" Feb 
23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.805073 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/92d294e9-091b-420f-aae8-da3bcab119e4-oauth-serving-cert\") pod \"console-f9d7485db-g5xff\" (UID: \"92d294e9-091b-420f-aae8-da3bcab119e4\") " pod="openshift-console/console-f9d7485db-g5xff" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.805312 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/92d294e9-091b-420f-aae8-da3bcab119e4-trusted-ca-bundle\") pod \"console-f9d7485db-g5xff\" (UID: \"92d294e9-091b-420f-aae8-da3bcab119e4\") " pod="openshift-console/console-f9d7485db-g5xff" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.805436 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3bbf6e9-0c6a-4d4a-9d35-25f941df2308-config\") pod \"console-operator-58897d9998-2mld6\" (UID: \"f3bbf6e9-0c6a-4d4a-9d35-25f941df2308\") " pod="openshift-console-operator/console-operator-58897d9998-2mld6" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.805506 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6981202b-1a1f-4b0c-b68e-6a45d9914fc8-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-nhtzb\" (UID: \"6981202b-1a1f-4b0c-b68e-6a45d9914fc8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nhtzb" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.805527 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a8e73ee2-9e80-4123-bece-02008233348f-etcd-client\") pod \"etcd-operator-b45778765-xgbf2\" (UID: \"a8e73ee2-9e80-4123-bece-02008233348f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xgbf2" Feb 23 00:09:40 
crc kubenswrapper[4735]: I0223 00:09:40.805611 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e0c2b77e-cacd-4257-b3c1-699bd3f4261f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-lnz4f\" (UID: \"e0c2b77e-cacd-4257-b3c1-699bd3f4261f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lnz4f" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.805645 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8e73ee2-9e80-4123-bece-02008233348f-serving-cert\") pod \"etcd-operator-b45778765-xgbf2\" (UID: \"a8e73ee2-9e80-4123-bece-02008233348f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xgbf2" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.806253 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a14dd71a-cfac-4f4e-9cbf-89599011d970-image-import-ca\") pod \"apiserver-76f77b778f-87xcv\" (UID: \"a14dd71a-cfac-4f4e-9cbf-89599011d970\") " pod="openshift-apiserver/apiserver-76f77b778f-87xcv" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.806315 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25de1778-6e9d-4158-9aa4-02e1607d7f45-client-ca\") pod \"route-controller-manager-6576b87f9c-c6xqc\" (UID: \"25de1778-6e9d-4158-9aa4-02e1607d7f45\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6xqc" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.806562 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/792fd1c0-de50-429b-89ff-6a3f64541e29-encryption-config\") pod \"apiserver-7bbb656c7d-g2t27\" (UID: \"792fd1c0-de50-429b-89ff-6a3f64541e29\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g2t27" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.807013 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/92d294e9-091b-420f-aae8-da3bcab119e4-console-oauth-config\") pod \"console-f9d7485db-g5xff\" (UID: \"92d294e9-091b-420f-aae8-da3bcab119e4\") " pod="openshift-console/console-f9d7485db-g5xff" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.807124 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6981202b-1a1f-4b0c-b68e-6a45d9914fc8-serving-cert\") pod \"authentication-operator-69f744f599-nhtzb\" (UID: \"6981202b-1a1f-4b0c-b68e-6a45d9914fc8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nhtzb" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.807217 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25de1778-6e9d-4158-9aa4-02e1607d7f45-serving-cert\") pod \"route-controller-manager-6576b87f9c-c6xqc\" (UID: \"25de1778-6e9d-4158-9aa4-02e1607d7f45\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6xqc" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.807835 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-jvnk7\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") " pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.807885 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a14dd71a-cfac-4f4e-9cbf-89599011d970-etcd-client\") pod 
\"apiserver-76f77b778f-87xcv\" (UID: \"a14dd71a-cfac-4f4e-9cbf-89599011d970\") " pod="openshift-apiserver/apiserver-76f77b778f-87xcv" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.808064 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/06270ded-7d6b-4966-9f10-a432c593bdfe-machine-approver-tls\") pod \"machine-approver-56656f9798-htb9k\" (UID: \"06270ded-7d6b-4966-9f10-a432c593bdfe\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-htb9k" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.808391 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-jvnk7\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") " pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.808731 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-jvnk7\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") " pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.808905 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-jvnk7\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") " pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.809089 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-jvnk7\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") " pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.809098 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-jvnk7\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") " pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.810192 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a14dd71a-cfac-4f4e-9cbf-89599011d970-encryption-config\") pod \"apiserver-76f77b778f-87xcv\" (UID: \"a14dd71a-cfac-4f4e-9cbf-89599011d970\") " pod="openshift-apiserver/apiserver-76f77b778f-87xcv" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.810244 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8bab8f58-226d-43fd-8a33-668c4e060bfb-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-qzhgh\" (UID: \"8bab8f58-226d-43fd-8a33-668c4e060bfb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qzhgh" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.810833 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/299ad63a-4584-4079-81c4-b8645326d3d0-serving-cert\") pod \"controller-manager-879f6c89f-tjkjf\" (UID: \"299ad63a-4584-4079-81c4-b8645326d3d0\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-tjkjf" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.811606 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.816400 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/792fd1c0-de50-429b-89ff-6a3f64541e29-etcd-client\") pod \"apiserver-7bbb656c7d-g2t27\" (UID: \"792fd1c0-de50-429b-89ff-6a3f64541e29\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g2t27" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.821794 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/58844231-adcc-497d-83a3-bba779038cc2-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-fpbc2\" (UID: \"58844231-adcc-497d-83a3-bba779038cc2\") " pod="openshift-marketplace/marketplace-operator-79b997595-fpbc2" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.840602 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.848507 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/58844231-adcc-497d-83a3-bba779038cc2-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-fpbc2\" (UID: \"58844231-adcc-497d-83a3-bba779038cc2\") " pod="openshift-marketplace/marketplace-operator-79b997595-fpbc2" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.855579 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.891596 4735 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.898481 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/0b661fda-d14e-4491-896c-4d6812a638b5-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-xmz5p\" (UID: \"0b661fda-d14e-4491-896c-4d6812a638b5\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xmz5p" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.916278 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.932304 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.951996 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.973888 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.989143 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6af08a92-85a9-4200-a0d8-abff73e0e93b-serving-cert\") pod \"openshift-config-operator-7777fb866f-qnx85\" (UID: \"6af08a92-85a9-4200-a0d8-abff73e0e93b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qnx85" Feb 23 00:09:40 crc kubenswrapper[4735]: I0223 00:09:40.991465 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 
00:09:41.011436 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 00:09:41.032475 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 00:09:41.055160 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 00:09:41.069391 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a4c1e470-e025-4285-a712-52b4fead0799-metrics-tls\") pod \"dns-operator-744455d44c-d2dh9\" (UID: \"a4c1e470-e025-4285-a712-52b4fead0799\") " pod="openshift-dns-operator/dns-operator-744455d44c-d2dh9" Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 00:09:41.072550 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 00:09:41.093089 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 00:09:41.112235 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 00:09:41.133206 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 00:09:41.152130 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 00:09:41.172983 4735 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 00:09:41.191758 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 00:09:41.212397 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 00:09:41.232479 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 00:09:41.252725 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 00:09:41.272118 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 00:09:41.291742 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 00:09:41.311575 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 00:09:41.333251 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 00:09:41.353369 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 00:09:41.372981 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 23 00:09:41 
crc kubenswrapper[4735]: I0223 00:09:41.391805 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 00:09:41.411952 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 00:09:41.433325 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 00:09:41.452358 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 00:09:41.471378 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 00:09:41.493063 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 00:09:41.512118 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 00:09:41.533443 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 00:09:41.552614 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 00:09:41.571970 4735 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 00:09:41.592306 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 00:09:41.612734 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 00:09:41.632408 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 00:09:41.652802 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 00:09:41.670030 4735 request.go:700] Waited for 1.018622869s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-service-ca/configmaps?fieldSelector=metadata.name%3Dsigning-cabundle&limit=500&resourceVersion=0 Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 00:09:41.671817 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 23 00:09:41 crc kubenswrapper[4735]: E0223 00:09:41.689054 4735 configmap.go:193] Couldn't get configMap openshift-apiserver-operator/openshift-apiserver-operator-config: failed to sync configmap cache: timed out waiting for the condition Feb 23 00:09:41 crc kubenswrapper[4735]: E0223 00:09:41.689394 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/defd5be8-af50-4f87-a9a5-c166a9e3ce44-config podName:defd5be8-af50-4f87-a9a5-c166a9e3ce44 nodeName:}" failed. No retries permitted until 2026-02-23 00:09:42.189353378 +0000 UTC m=+140.652899399 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/defd5be8-af50-4f87-a9a5-c166a9e3ce44-config") pod "openshift-apiserver-operator-796bbdcf4f-jqwls" (UID: "defd5be8-af50-4f87-a9a5-c166a9e3ce44") : failed to sync configmap cache: timed out waiting for the condition Feb 23 00:09:41 crc kubenswrapper[4735]: E0223 00:09:41.689080 4735 secret.go:188] Couldn't get secret openshift-apiserver-operator/openshift-apiserver-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Feb 23 00:09:41 crc kubenswrapper[4735]: E0223 00:09:41.690140 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/defd5be8-af50-4f87-a9a5-c166a9e3ce44-serving-cert podName:defd5be8-af50-4f87-a9a5-c166a9e3ce44 nodeName:}" failed. No retries permitted until 2026-02-23 00:09:42.190116597 +0000 UTC m=+140.653662608 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/defd5be8-af50-4f87-a9a5-c166a9e3ce44-serving-cert") pod "openshift-apiserver-operator-796bbdcf4f-jqwls" (UID: "defd5be8-af50-4f87-a9a5-c166a9e3ce44") : failed to sync secret cache: timed out waiting for the condition Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 00:09:41.693610 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 00:09:41.713160 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 00:09:41.732363 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 00:09:41.751959 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 23 00:09:41 crc 
kubenswrapper[4735]: I0223 00:09:41.773100 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 00:09:41.792009 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 00:09:41.812430 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 00:09:41.832519 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 00:09:41.853102 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 00:09:41.873297 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 00:09:41.892412 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 00:09:41.912126 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 00:09:41.932238 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 00:09:41.952150 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 23 00:09:41 crc kubenswrapper[4735]: I0223 00:09:41.971660 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 23 00:09:41 crc 
kubenswrapper[4735]: I0223 00:09:41.991807 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.012904 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.032302 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.059741 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.071292 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.092677 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.113067 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.132152 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.152146 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.172206 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.192236 4735 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.213072 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.214647 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/defd5be8-af50-4f87-a9a5-c166a9e3ce44-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-jqwls\" (UID: \"defd5be8-af50-4f87-a9a5-c166a9e3ce44\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jqwls" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.215166 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/defd5be8-af50-4f87-a9a5-c166a9e3ce44-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jqwls\" (UID: \"defd5be8-af50-4f87-a9a5-c166a9e3ce44\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jqwls" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.232492 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.252960 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.279693 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.293097 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 23 00:09:42 
crc kubenswrapper[4735]: I0223 00:09:42.312829 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.333332 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.352450 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.400624 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khfnz\" (UniqueName: \"kubernetes.io/projected/f0643721-5a54-4c37-b857-474ed61ef531-kube-api-access-khfnz\") pod \"machine-api-operator-5694c8668f-4mwf4\" (UID: \"f0643721-5a54-4c37-b857-474ed61ef531\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4mwf4" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.433967 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.453317 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.472587 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.492332 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.512027 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.533019 4735 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.552488 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.572182 4735 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.591814 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.612295 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.653827 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-4mwf4" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.660591 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8l4rw\" (UniqueName: \"kubernetes.io/projected/cf538d4f-1acd-4e61-9827-7430d1099138-kube-api-access-8l4rw\") pod \"oauth-openshift-558db77b4-jvnk7\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") " pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.670146 4735 request.go:700] Waited for 1.870485454s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/serviceaccounts/etcd-operator/token Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.686215 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xs8l\" (UniqueName: \"kubernetes.io/projected/92d294e9-091b-420f-aae8-da3bcab119e4-kube-api-access-9xs8l\") pod \"console-f9d7485db-g5xff\" 
(UID: \"92d294e9-091b-420f-aae8-da3bcab119e4\") " pod="openshift-console/console-f9d7485db-g5xff" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.700940 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjhfg\" (UniqueName: \"kubernetes.io/projected/a8e73ee2-9e80-4123-bece-02008233348f-kube-api-access-vjhfg\") pod \"etcd-operator-b45778765-xgbf2\" (UID: \"a8e73ee2-9e80-4123-bece-02008233348f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xgbf2" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.725268 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e0c2b77e-cacd-4257-b3c1-699bd3f4261f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-lnz4f\" (UID: \"e0c2b77e-cacd-4257-b3c1-699bd3f4261f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lnz4f" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.743043 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvdzx\" (UniqueName: \"kubernetes.io/projected/58844231-adcc-497d-83a3-bba779038cc2-kube-api-access-hvdzx\") pod \"marketplace-operator-79b997595-fpbc2\" (UID: \"58844231-adcc-497d-83a3-bba779038cc2\") " pod="openshift-marketplace/marketplace-operator-79b997595-fpbc2" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.756726 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2b8t\" (UniqueName: \"kubernetes.io/projected/792fd1c0-de50-429b-89ff-6a3f64541e29-kube-api-access-h2b8t\") pod \"apiserver-7bbb656c7d-g2t27\" (UID: \"792fd1c0-de50-429b-89ff-6a3f64541e29\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g2t27" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.775051 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mp6c9\" (UniqueName: 
\"kubernetes.io/projected/06270ded-7d6b-4966-9f10-a432c593bdfe-kube-api-access-mp6c9\") pod \"machine-approver-56656f9798-htb9k\" (UID: \"06270ded-7d6b-4966-9f10-a432c593bdfe\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-htb9k" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.791316 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w44v\" (UniqueName: \"kubernetes.io/projected/6af08a92-85a9-4200-a0d8-abff73e0e93b-kube-api-access-8w44v\") pod \"openshift-config-operator-7777fb866f-qnx85\" (UID: \"6af08a92-85a9-4200-a0d8-abff73e0e93b\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qnx85" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.799012 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g2t27" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.815537 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdl7j\" (UniqueName: \"kubernetes.io/projected/5732e921-c103-4eda-9705-afec618471c9-kube-api-access-qdl7j\") pod \"kube-storage-version-migrator-operator-b67b599dd-wtlds\" (UID: \"5732e921-c103-4eda-9705-afec618471c9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wtlds" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.839487 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7rsp\" (UniqueName: \"kubernetes.io/projected/b40b3484-891d-4669-838b-90c1b8d6869e-kube-api-access-l7rsp\") pod \"downloads-7954f5f757-dsqjf\" (UID: \"b40b3484-891d-4669-838b-90c1b8d6869e\") " pod="openshift-console/downloads-7954f5f757-dsqjf" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.846458 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.850228 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7r77\" (UniqueName: \"kubernetes.io/projected/f3bbf6e9-0c6a-4d4a-9d35-25f941df2308-kube-api-access-g7r77\") pod \"console-operator-58897d9998-2mld6\" (UID: \"f3bbf6e9-0c6a-4d4a-9d35-25f941df2308\") " pod="openshift-console-operator/console-operator-58897d9998-2mld6" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.858710 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-dsqjf" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.861140 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-2mld6" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.872056 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj2xf\" (UniqueName: \"kubernetes.io/projected/6cb942f6-2bc5-4662-929c-849ec29baddc-kube-api-access-pj2xf\") pod \"image-pruner-29530080-8vgj6\" (UID: \"6cb942f6-2bc5-4662-929c-849ec29baddc\") " pod="openshift-image-registry/image-pruner-29530080-8vgj6" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.877594 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-xgbf2" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.885226 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-g5xff" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.889841 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9lqk\" (UniqueName: \"kubernetes.io/projected/25de1778-6e9d-4158-9aa4-02e1607d7f45-kube-api-access-z9lqk\") pod \"route-controller-manager-6576b87f9c-c6xqc\" (UID: \"25de1778-6e9d-4158-9aa4-02e1607d7f45\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6xqc" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.920365 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhprn\" (UniqueName: \"kubernetes.io/projected/8bab8f58-226d-43fd-8a33-668c4e060bfb-kube-api-access-xhprn\") pod \"openshift-controller-manager-operator-756b6f6bc6-qzhgh\" (UID: \"8bab8f58-226d-43fd-8a33-668c4e060bfb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qzhgh" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.934656 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-4mwf4"] Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.935024 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktb5p\" (UniqueName: \"kubernetes.io/projected/299ad63a-4584-4079-81c4-b8645326d3d0-kube-api-access-ktb5p\") pod \"controller-manager-879f6c89f-tjkjf\" (UID: \"299ad63a-4584-4079-81c4-b8645326d3d0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tjkjf" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.947638 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhfk4\" (UniqueName: \"kubernetes.io/projected/a4c1e470-e025-4285-a712-52b4fead0799-kube-api-access-jhfk4\") pod \"dns-operator-744455d44c-d2dh9\" (UID: \"a4c1e470-e025-4285-a712-52b4fead0799\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-d2dh9" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.960126 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qzhgh" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.963810 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29530080-8vgj6" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.969247 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wtlds" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.974083 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg95s\" (UniqueName: \"kubernetes.io/projected/e0c2b77e-cacd-4257-b3c1-699bd3f4261f-kube-api-access-kg95s\") pod \"cluster-image-registry-operator-dc59b4c8b-lnz4f\" (UID: \"e0c2b77e-cacd-4257-b3c1-699bd3f4261f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lnz4f" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.980149 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-fpbc2" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.989460 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w99n\" (UniqueName: \"kubernetes.io/projected/6981202b-1a1f-4b0c-b68e-6a45d9914fc8-kube-api-access-5w99n\") pod \"authentication-operator-69f744f599-nhtzb\" (UID: \"6981202b-1a1f-4b0c-b68e-6a45d9914fc8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-nhtzb" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.993899 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6xqc" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.994005 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qnx85" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.994282 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-tjkjf" Feb 23 00:09:42 crc kubenswrapper[4735]: I0223 00:09:42.999307 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-d2dh9" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.010052 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp44v\" (UniqueName: \"kubernetes.io/projected/a14dd71a-cfac-4f4e-9cbf-89599011d970-kube-api-access-zp44v\") pod \"apiserver-76f77b778f-87xcv\" (UID: \"a14dd71a-cfac-4f4e-9cbf-89599011d970\") " pod="openshift-apiserver/apiserver-76f77b778f-87xcv" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.014964 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-htb9k" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.025450 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drd98\" (UniqueName: \"kubernetes.io/projected/0b661fda-d14e-4491-896c-4d6812a638b5-kube-api-access-drd98\") pod \"control-plane-machine-set-operator-78cbb6b69f-xmz5p\" (UID: \"0b661fda-d14e-4491-896c-4d6812a638b5\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xmz5p" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.048613 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-nhtzb" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.056790 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.064101 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-g2t27"] Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.066064 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-87xcv" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.067242 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/defd5be8-af50-4f87-a9a5-c166a9e3ce44-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jqwls\" (UID: \"defd5be8-af50-4f87-a9a5-c166a9e3ce44\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jqwls" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.078319 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.089689 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/defd5be8-af50-4f87-a9a5-c166a9e3ce44-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-jqwls\" (UID: \"defd5be8-af50-4f87-a9a5-c166a9e3ce44\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jqwls" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.092445 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.109404 
4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-dsqjf"] Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.115395 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.133021 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbv74\" (UniqueName: \"kubernetes.io/projected/defd5be8-af50-4f87-a9a5-c166a9e3ce44-kube-api-access-fbv74\") pod \"openshift-apiserver-operator-796bbdcf4f-jqwls\" (UID: \"defd5be8-af50-4f87-a9a5-c166a9e3ce44\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jqwls" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.144202 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jvnk7"] Feb 23 00:09:43 crc kubenswrapper[4735]: W0223 00:09:43.182909 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf538d4f_1acd_4e61_9827_7430d1099138.slice/crio-36dbe32caa78c08a4de251b13d878b7abdd28dcb0ec2dcf42ff6beb905c768d7 WatchSource:0}: Error finding container 36dbe32caa78c08a4de251b13d878b7abdd28dcb0ec2dcf42ff6beb905c768d7: Status 404 returned error can't find the container with id 36dbe32caa78c08a4de251b13d878b7abdd28dcb0ec2dcf42ff6beb905c768d7 Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.195296 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lnz4f" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.220468 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-g5xff"] Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.223351 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jqwls" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.236809 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f7v9\" (UniqueName: \"kubernetes.io/projected/f4dd19db-78f4-45d0-a0e6-91505f2ade61-kube-api-access-8f7v9\") pod \"migrator-59844c95c7-9xsmt\" (UID: \"f4dd19db-78f4-45d0-a0e6-91505f2ade61\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9xsmt" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.236877 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/769cd336-e909-4164-89c9-e0874926fd3d-registry-tls\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.236896 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7c6f4b8-7812-4b99-8595-fb39bdf58a5b-config\") pod \"service-ca-operator-777779d784-kkmzn\" (UID: \"a7c6f4b8-7812-4b99-8595-fb39bdf58a5b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kkmzn" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.236916 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bfb40dcb-d460-44f4-a461-fc423b7aed52-metrics-certs\") pod \"router-default-5444994796-f77lm\" (UID: \"bfb40dcb-d460-44f4-a461-fc423b7aed52\") " pod="openshift-ingress/router-default-5444994796-f77lm" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.236952 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cf8f8be1-97ee-48aa-96cd-d88d2a29da73-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6ggwm\" (UID: \"cf8f8be1-97ee-48aa-96cd-d88d2a29da73\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6ggwm" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.236983 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/769cd336-e909-4164-89c9-e0874926fd3d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.237000 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3895d19d-15a0-442a-9156-726252aea7a1-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-wgw78\" (UID: \"3895d19d-15a0-442a-9156-726252aea7a1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wgw78" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.237081 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1bff1a80-3f8e-4297-9f56-701eea3c44f4-apiservice-cert\") pod \"packageserver-d55dfcdfc-k8wg7\" (UID: \"1bff1a80-3f8e-4297-9f56-701eea3c44f4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k8wg7" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.237099 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/769cd336-e909-4164-89c9-e0874926fd3d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: 
\"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.237115 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e160ff4c-3954-4858-a65a-6ca9b9051e88-certs\") pod \"machine-config-server-8g4g7\" (UID: \"e160ff4c-3954-4858-a65a-6ca9b9051e88\") " pod="openshift-machine-config-operator/machine-config-server-8g4g7" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.237152 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbrxt\" (UniqueName: \"kubernetes.io/projected/e160ff4c-3954-4858-a65a-6ca9b9051e88-kube-api-access-wbrxt\") pod \"machine-config-server-8g4g7\" (UID: \"e160ff4c-3954-4858-a65a-6ca9b9051e88\") " pod="openshift-machine-config-operator/machine-config-server-8g4g7" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.237184 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/11c76755-d9b6-472e-b761-97f73562d736-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mvfx4\" (UID: \"11c76755-d9b6-472e-b761-97f73562d736\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvfx4" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.237210 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5cf4d35e-ea62-4c76-814b-c2224558d011-srv-cert\") pod \"olm-operator-6b444d44fb-x959d\" (UID: \"5cf4d35e-ea62-4c76-814b-c2224558d011\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x959d" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.237227 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/84d06127-3e73-44c4-840b-6bfed7e36f84-images\") pod \"machine-config-operator-74547568cd-kkhmq\" (UID: \"84d06127-3e73-44c4-840b-6bfed7e36f84\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kkhmq" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.237250 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11c76755-d9b6-472e-b761-97f73562d736-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mvfx4\" (UID: \"11c76755-d9b6-472e-b761-97f73562d736\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvfx4" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.237282 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4rf7\" (UniqueName: \"kubernetes.io/projected/6d167793-63cc-42a9-9986-0cbb627e2ee4-kube-api-access-h4rf7\") pod \"catalog-operator-68c6474976-l5546\" (UID: \"6d167793-63cc-42a9-9986-0cbb627e2ee4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l5546" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.237300 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrm86\" (UniqueName: \"kubernetes.io/projected/b9d77e51-e119-4f9e-b31f-90b7fec2c5be-kube-api-access-qrm86\") pod \"machine-config-controller-84d6567774-z7rvn\" (UID: \"b9d77e51-e119-4f9e-b31f-90b7fec2c5be\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z7rvn" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.237337 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/33a3fbdd-f167-4093-8616-f64abea28e24-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-chd9d\" (UID: \"33a3fbdd-f167-4093-8616-f64abea28e24\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-chd9d" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.237354 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsphb\" (UniqueName: \"kubernetes.io/projected/bd601b38-6549-439d-b1de-58e5c5c5c769-kube-api-access-vsphb\") pod \"service-ca-9c57cc56f-h9hzt\" (UID: \"bd601b38-6549-439d-b1de-58e5c5c5c769\") " pod="openshift-service-ca/service-ca-9c57cc56f-h9hzt" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.237369 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3895d19d-15a0-442a-9156-726252aea7a1-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-wgw78\" (UID: \"3895d19d-15a0-442a-9156-726252aea7a1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wgw78" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.237429 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/fec1da66-feec-438b-a9d2-a8f36d8ef790-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-knx8j\" (UID: \"fec1da66-feec-438b-a9d2-a8f36d8ef790\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-knx8j" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.237449 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98bjf\" (UniqueName: \"kubernetes.io/projected/1bff1a80-3f8e-4297-9f56-701eea3c44f4-kube-api-access-98bjf\") pod \"packageserver-d55dfcdfc-k8wg7\" (UID: 
\"1bff1a80-3f8e-4297-9f56-701eea3c44f4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k8wg7" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.237477 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fbea48c4-eae5-4fab-bd21-8c69eb791cb2-bound-sa-token\") pod \"ingress-operator-5b745b69d9-z22gj\" (UID: \"fbea48c4-eae5-4fab-bd21-8c69eb791cb2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z22gj" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.237550 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/769cd336-e909-4164-89c9-e0874926fd3d-registry-certificates\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.239752 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/1bff1a80-3f8e-4297-9f56-701eea3c44f4-tmpfs\") pod \"packageserver-d55dfcdfc-k8wg7\" (UID: \"1bff1a80-3f8e-4297-9f56-701eea3c44f4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k8wg7" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.239793 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qhf9\" (UniqueName: \"kubernetes.io/projected/845be7d4-ea70-441f-8af2-c44870256906-kube-api-access-8qhf9\") pod \"collect-profiles-29530080-85k47\" (UID: \"845be7d4-ea70-441f-8af2-c44870256906\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530080-85k47" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.239833 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3895d19d-15a0-442a-9156-726252aea7a1-config\") pod \"kube-controller-manager-operator-78b949d7b-wgw78\" (UID: \"3895d19d-15a0-442a-9156-726252aea7a1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wgw78" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.240794 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/84d06127-3e73-44c4-840b-6bfed7e36f84-auth-proxy-config\") pod \"machine-config-operator-74547568cd-kkhmq\" (UID: \"84d06127-3e73-44c4-840b-6bfed7e36f84\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kkhmq" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.240864 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf8f8be1-97ee-48aa-96cd-d88d2a29da73-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-6ggwm\" (UID: \"cf8f8be1-97ee-48aa-96cd-d88d2a29da73\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6ggwm" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.240888 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/bfb40dcb-d460-44f4-a461-fc423b7aed52-stats-auth\") pod \"router-default-5444994796-f77lm\" (UID: \"bfb40dcb-d460-44f4-a461-fc423b7aed52\") " pod="openshift-ingress/router-default-5444994796-f77lm" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.240968 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fbea48c4-eae5-4fab-bd21-8c69eb791cb2-trusted-ca\") pod 
\"ingress-operator-5b745b69d9-z22gj\" (UID: \"fbea48c4-eae5-4fab-bd21-8c69eb791cb2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z22gj" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.240999 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ph5c\" (UniqueName: \"kubernetes.io/projected/bfb40dcb-d460-44f4-a461-fc423b7aed52-kube-api-access-4ph5c\") pod \"router-default-5444994796-f77lm\" (UID: \"bfb40dcb-d460-44f4-a461-fc423b7aed52\") " pod="openshift-ingress/router-default-5444994796-f77lm" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.241027 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/845be7d4-ea70-441f-8af2-c44870256906-config-volume\") pod \"collect-profiles-29530080-85k47\" (UID: \"845be7d4-ea70-441f-8af2-c44870256906\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530080-85k47" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.241051 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fbea48c4-eae5-4fab-bd21-8c69eb791cb2-metrics-tls\") pod \"ingress-operator-5b745b69d9-z22gj\" (UID: \"fbea48c4-eae5-4fab-bd21-8c69eb791cb2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z22gj" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.241078 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/845be7d4-ea70-441f-8af2-c44870256906-secret-volume\") pod \"collect-profiles-29530080-85k47\" (UID: \"845be7d4-ea70-441f-8af2-c44870256906\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530080-85k47" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.241712 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf8f8be1-97ee-48aa-96cd-d88d2a29da73-config\") pod \"kube-apiserver-operator-766d6c64bb-6ggwm\" (UID: \"cf8f8be1-97ee-48aa-96cd-d88d2a29da73\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6ggwm" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.241742 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6d167793-63cc-42a9-9986-0cbb627e2ee4-srv-cert\") pod \"catalog-operator-68c6474976-l5546\" (UID: \"6d167793-63cc-42a9-9986-0cbb627e2ee4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l5546" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.242181 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/769cd336-e909-4164-89c9-e0874926fd3d-trusted-ca\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.242225 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwtcm\" (UniqueName: \"kubernetes.io/projected/115b8c26-73db-4da1-a9b6-567a7ce7151e-kube-api-access-cwtcm\") pod \"multus-admission-controller-857f4d67dd-lftlx\" (UID: \"115b8c26-73db-4da1-a9b6-567a7ce7151e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lftlx" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.242245 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e160ff4c-3954-4858-a65a-6ca9b9051e88-node-bootstrap-token\") pod \"machine-config-server-8g4g7\" (UID: 
\"e160ff4c-3954-4858-a65a-6ca9b9051e88\") " pod="openshift-machine-config-operator/machine-config-server-8g4g7" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.242268 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/bd601b38-6549-439d-b1de-58e5c5c5c769-signing-key\") pod \"service-ca-9c57cc56f-h9hzt\" (UID: \"bd601b38-6549-439d-b1de-58e5c5c5c769\") " pod="openshift-service-ca/service-ca-9c57cc56f-h9hzt" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.242761 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbz8p\" (UniqueName: \"kubernetes.io/projected/5cf4d35e-ea62-4c76-814b-c2224558d011-kube-api-access-lbz8p\") pod \"olm-operator-6b444d44fb-x959d\" (UID: \"5cf4d35e-ea62-4c76-814b-c2224558d011\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x959d" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.242781 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msn77\" (UniqueName: \"kubernetes.io/projected/33a3fbdd-f167-4093-8616-f64abea28e24-kube-api-access-msn77\") pod \"cluster-samples-operator-665b6dd947-chd9d\" (UID: \"33a3fbdd-f167-4093-8616-f64abea28e24\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-chd9d" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.242942 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/bd601b38-6549-439d-b1de-58e5c5c5c769-signing-cabundle\") pod \"service-ca-9c57cc56f-h9hzt\" (UID: \"bd601b38-6549-439d-b1de-58e5c5c5c769\") " pod="openshift-service-ca/service-ca-9c57cc56f-h9hzt" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.243460 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b9d77e51-e119-4f9e-b31f-90b7fec2c5be-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-z7rvn\" (UID: \"b9d77e51-e119-4f9e-b31f-90b7fec2c5be\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z7rvn" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.244320 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/84d06127-3e73-44c4-840b-6bfed7e36f84-proxy-tls\") pod \"machine-config-operator-74547568cd-kkhmq\" (UID: \"84d06127-3e73-44c4-840b-6bfed7e36f84\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kkhmq" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.244346 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrspb\" (UniqueName: \"kubernetes.io/projected/84d06127-3e73-44c4-840b-6bfed7e36f84-kube-api-access-rrspb\") pod \"machine-config-operator-74547568cd-kkhmq\" (UID: \"84d06127-3e73-44c4-840b-6bfed7e36f84\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kkhmq" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.244919 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11c76755-d9b6-472e-b761-97f73562d736-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mvfx4\" (UID: \"11c76755-d9b6-472e-b761-97f73562d736\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvfx4" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.244959 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/bfb40dcb-d460-44f4-a461-fc423b7aed52-default-certificate\") pod \"router-default-5444994796-f77lm\" (UID: \"bfb40dcb-d460-44f4-a461-fc423b7aed52\") " pod="openshift-ingress/router-default-5444994796-f77lm" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.245247 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/769cd336-e909-4164-89c9-e0874926fd3d-bound-sa-token\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.245320 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7k2x\" (UniqueName: \"kubernetes.io/projected/a7c6f4b8-7812-4b99-8595-fb39bdf58a5b-kube-api-access-d7k2x\") pod \"service-ca-operator-777779d784-kkmzn\" (UID: \"a7c6f4b8-7812-4b99-8595-fb39bdf58a5b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kkmzn" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.245423 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjtrd\" (UniqueName: \"kubernetes.io/projected/fbea48c4-eae5-4fab-bd21-8c69eb791cb2-kube-api-access-kjtrd\") pod \"ingress-operator-5b745b69d9-z22gj\" (UID: \"fbea48c4-eae5-4fab-bd21-8c69eb791cb2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z22gj" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.245519 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b6f8\" (UniqueName: \"kubernetes.io/projected/769cd336-e909-4164-89c9-e0874926fd3d-kube-api-access-4b6f8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.245547 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/115b8c26-73db-4da1-a9b6-567a7ce7151e-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-lftlx\" (UID: \"115b8c26-73db-4da1-a9b6-567a7ce7151e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lftlx" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.245654 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1bff1a80-3f8e-4297-9f56-701eea3c44f4-webhook-cert\") pod \"packageserver-d55dfcdfc-k8wg7\" (UID: \"1bff1a80-3f8e-4297-9f56-701eea3c44f4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k8wg7" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.245815 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.245888 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5cf4d35e-ea62-4c76-814b-c2224558d011-profile-collector-cert\") pod \"olm-operator-6b444d44fb-x959d\" (UID: \"5cf4d35e-ea62-4c76-814b-c2224558d011\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x959d" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.246000 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7c6f4b8-7812-4b99-8595-fb39bdf58a5b-serving-cert\") pod \"service-ca-operator-777779d784-kkmzn\" (UID: \"a7c6f4b8-7812-4b99-8595-fb39bdf58a5b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kkmzn" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.246048 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2vsk\" (UniqueName: \"kubernetes.io/projected/fec1da66-feec-438b-a9d2-a8f36d8ef790-kube-api-access-b2vsk\") pod \"package-server-manager-789f6589d5-knx8j\" (UID: \"fec1da66-feec-438b-a9d2-a8f36d8ef790\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-knx8j" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.246088 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6d167793-63cc-42a9-9986-0cbb627e2ee4-profile-collector-cert\") pod \"catalog-operator-68c6474976-l5546\" (UID: \"6d167793-63cc-42a9-9986-0cbb627e2ee4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l5546" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.246106 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfb40dcb-d460-44f4-a461-fc423b7aed52-service-ca-bundle\") pod \"router-default-5444994796-f77lm\" (UID: \"bfb40dcb-d460-44f4-a461-fc423b7aed52\") " pod="openshift-ingress/router-default-5444994796-f77lm" Feb 23 00:09:43 crc kubenswrapper[4735]: E0223 00:09:43.246119 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 00:09:43.7461064 +0000 UTC m=+142.209652361 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6lsk2" (UID: "769cd336-e909-4164-89c9-e0874926fd3d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.246141 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b9d77e51-e119-4f9e-b31f-90b7fec2c5be-proxy-tls\") pod \"machine-config-controller-84d6567774-z7rvn\" (UID: \"b9d77e51-e119-4f9e-b31f-90b7fec2c5be\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z7rvn" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.288841 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xmz5p" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.318533 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-2mld6"] Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.324822 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wtlds"] Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.346821 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.347005 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/9af6d235-3d3f-49c9-9338-e7302ef9deae-csi-data-dir\") pod \"csi-hostpathplugin-vh89z\" (UID: \"9af6d235-3d3f-49c9-9338-e7302ef9deae\") " pod="hostpath-provisioner/csi-hostpathplugin-vh89z" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.347026 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slkrn\" (UniqueName: \"kubernetes.io/projected/3f7bc3cf-ab6c-413d-b5a2-9b970f5a5d85-kube-api-access-slkrn\") pod \"dns-default-5w4w6\" (UID: \"3f7bc3cf-ab6c-413d-b5a2-9b970f5a5d85\") " pod="openshift-dns/dns-default-5w4w6" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.347049 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/84d06127-3e73-44c4-840b-6bfed7e36f84-auth-proxy-config\") pod 
\"machine-config-operator-74547568cd-kkhmq\" (UID: \"84d06127-3e73-44c4-840b-6bfed7e36f84\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kkhmq" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.347065 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf8f8be1-97ee-48aa-96cd-d88d2a29da73-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-6ggwm\" (UID: \"cf8f8be1-97ee-48aa-96cd-d88d2a29da73\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6ggwm" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.347079 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/bfb40dcb-d460-44f4-a461-fc423b7aed52-stats-auth\") pod \"router-default-5444994796-f77lm\" (UID: \"bfb40dcb-d460-44f4-a461-fc423b7aed52\") " pod="openshift-ingress/router-default-5444994796-f77lm" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.347093 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fbea48c4-eae5-4fab-bd21-8c69eb791cb2-trusted-ca\") pod \"ingress-operator-5b745b69d9-z22gj\" (UID: \"fbea48c4-eae5-4fab-bd21-8c69eb791cb2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z22gj" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.347110 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ph5c\" (UniqueName: \"kubernetes.io/projected/bfb40dcb-d460-44f4-a461-fc423b7aed52-kube-api-access-4ph5c\") pod \"router-default-5444994796-f77lm\" (UID: \"bfb40dcb-d460-44f4-a461-fc423b7aed52\") " pod="openshift-ingress/router-default-5444994796-f77lm" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.347123 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/845be7d4-ea70-441f-8af2-c44870256906-config-volume\") pod \"collect-profiles-29530080-85k47\" (UID: \"845be7d4-ea70-441f-8af2-c44870256906\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530080-85k47" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.347138 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fbea48c4-eae5-4fab-bd21-8c69eb791cb2-metrics-tls\") pod \"ingress-operator-5b745b69d9-z22gj\" (UID: \"fbea48c4-eae5-4fab-bd21-8c69eb791cb2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z22gj" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.347156 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f7bc3cf-ab6c-413d-b5a2-9b970f5a5d85-config-volume\") pod \"dns-default-5w4w6\" (UID: \"3f7bc3cf-ab6c-413d-b5a2-9b970f5a5d85\") " pod="openshift-dns/dns-default-5w4w6" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.347172 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/845be7d4-ea70-441f-8af2-c44870256906-secret-volume\") pod \"collect-profiles-29530080-85k47\" (UID: \"845be7d4-ea70-441f-8af2-c44870256906\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530080-85k47" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.347224 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf8f8be1-97ee-48aa-96cd-d88d2a29da73-config\") pod \"kube-apiserver-operator-766d6c64bb-6ggwm\" (UID: \"cf8f8be1-97ee-48aa-96cd-d88d2a29da73\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6ggwm" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.347242 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6d167793-63cc-42a9-9986-0cbb627e2ee4-srv-cert\") pod \"catalog-operator-68c6474976-l5546\" (UID: \"6d167793-63cc-42a9-9986-0cbb627e2ee4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l5546" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.347257 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/769cd336-e909-4164-89c9-e0874926fd3d-trusted-ca\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.347272 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwtcm\" (UniqueName: \"kubernetes.io/projected/115b8c26-73db-4da1-a9b6-567a7ce7151e-kube-api-access-cwtcm\") pod \"multus-admission-controller-857f4d67dd-lftlx\" (UID: \"115b8c26-73db-4da1-a9b6-567a7ce7151e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lftlx" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.347285 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e160ff4c-3954-4858-a65a-6ca9b9051e88-node-bootstrap-token\") pod \"machine-config-server-8g4g7\" (UID: \"e160ff4c-3954-4858-a65a-6ca9b9051e88\") " pod="openshift-machine-config-operator/machine-config-server-8g4g7" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.347300 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/bd601b38-6549-439d-b1de-58e5c5c5c769-signing-key\") pod \"service-ca-9c57cc56f-h9hzt\" (UID: \"bd601b38-6549-439d-b1de-58e5c5c5c769\") " pod="openshift-service-ca/service-ca-9c57cc56f-h9hzt" Feb 23 00:09:43 crc 
kubenswrapper[4735]: I0223 00:09:43.347318 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d1190643-6cbb-4ecd-b76e-3fe0f8a7ec57-cert\") pod \"ingress-canary-42hhs\" (UID: \"d1190643-6cbb-4ecd-b76e-3fe0f8a7ec57\") " pod="openshift-ingress-canary/ingress-canary-42hhs" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.347333 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbz8p\" (UniqueName: \"kubernetes.io/projected/5cf4d35e-ea62-4c76-814b-c2224558d011-kube-api-access-lbz8p\") pod \"olm-operator-6b444d44fb-x959d\" (UID: \"5cf4d35e-ea62-4c76-814b-c2224558d011\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x959d" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.347349 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msn77\" (UniqueName: \"kubernetes.io/projected/33a3fbdd-f167-4093-8616-f64abea28e24-kube-api-access-msn77\") pod \"cluster-samples-operator-665b6dd947-chd9d\" (UID: \"33a3fbdd-f167-4093-8616-f64abea28e24\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-chd9d" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.347364 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/bd601b38-6549-439d-b1de-58e5c5c5c769-signing-cabundle\") pod \"service-ca-9c57cc56f-h9hzt\" (UID: \"bd601b38-6549-439d-b1de-58e5c5c5c769\") " pod="openshift-service-ca/service-ca-9c57cc56f-h9hzt" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.347389 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b9d77e51-e119-4f9e-b31f-90b7fec2c5be-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-z7rvn\" (UID: 
\"b9d77e51-e119-4f9e-b31f-90b7fec2c5be\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z7rvn" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.347413 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/84d06127-3e73-44c4-840b-6bfed7e36f84-proxy-tls\") pod \"machine-config-operator-74547568cd-kkhmq\" (UID: \"84d06127-3e73-44c4-840b-6bfed7e36f84\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kkhmq" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.347427 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrspb\" (UniqueName: \"kubernetes.io/projected/84d06127-3e73-44c4-840b-6bfed7e36f84-kube-api-access-rrspb\") pod \"machine-config-operator-74547568cd-kkhmq\" (UID: \"84d06127-3e73-44c4-840b-6bfed7e36f84\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kkhmq" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.347450 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11c76755-d9b6-472e-b761-97f73562d736-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mvfx4\" (UID: \"11c76755-d9b6-472e-b761-97f73562d736\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvfx4" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.347465 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/bfb40dcb-d460-44f4-a461-fc423b7aed52-default-certificate\") pod \"router-default-5444994796-f77lm\" (UID: \"bfb40dcb-d460-44f4-a461-fc423b7aed52\") " pod="openshift-ingress/router-default-5444994796-f77lm" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.347494 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/769cd336-e909-4164-89c9-e0874926fd3d-bound-sa-token\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.347510 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7k2x\" (UniqueName: \"kubernetes.io/projected/a7c6f4b8-7812-4b99-8595-fb39bdf58a5b-kube-api-access-d7k2x\") pod \"service-ca-operator-777779d784-kkmzn\" (UID: \"a7c6f4b8-7812-4b99-8595-fb39bdf58a5b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kkmzn" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.347524 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjtrd\" (UniqueName: \"kubernetes.io/projected/fbea48c4-eae5-4fab-bd21-8c69eb791cb2-kube-api-access-kjtrd\") pod \"ingress-operator-5b745b69d9-z22gj\" (UID: \"fbea48c4-eae5-4fab-bd21-8c69eb791cb2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z22gj" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.347538 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b6f8\" (UniqueName: \"kubernetes.io/projected/769cd336-e909-4164-89c9-e0874926fd3d-kube-api-access-4b6f8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.347552 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/115b8c26-73db-4da1-a9b6-567a7ce7151e-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-lftlx\" (UID: \"115b8c26-73db-4da1-a9b6-567a7ce7151e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lftlx" Feb 23 00:09:43 
crc kubenswrapper[4735]: I0223 00:09:43.347566 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1bff1a80-3f8e-4297-9f56-701eea3c44f4-webhook-cert\") pod \"packageserver-d55dfcdfc-k8wg7\" (UID: \"1bff1a80-3f8e-4297-9f56-701eea3c44f4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k8wg7" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.347587 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5cf4d35e-ea62-4c76-814b-c2224558d011-profile-collector-cert\") pod \"olm-operator-6b444d44fb-x959d\" (UID: \"5cf4d35e-ea62-4c76-814b-c2224558d011\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x959d" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.347604 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7c6f4b8-7812-4b99-8595-fb39bdf58a5b-serving-cert\") pod \"service-ca-operator-777779d784-kkmzn\" (UID: \"a7c6f4b8-7812-4b99-8595-fb39bdf58a5b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kkmzn" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.347618 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2vsk\" (UniqueName: \"kubernetes.io/projected/fec1da66-feec-438b-a9d2-a8f36d8ef790-kube-api-access-b2vsk\") pod \"package-server-manager-789f6589d5-knx8j\" (UID: \"fec1da66-feec-438b-a9d2-a8f36d8ef790\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-knx8j" Feb 23 00:09:43 crc kubenswrapper[4735]: E0223 00:09:43.347917 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-23 00:09:43.847731405 +0000 UTC m=+142.311277376 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.348012 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6d167793-63cc-42a9-9986-0cbb627e2ee4-profile-collector-cert\") pod \"catalog-operator-68c6474976-l5546\" (UID: \"6d167793-63cc-42a9-9986-0cbb627e2ee4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l5546" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.348036 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfb40dcb-d460-44f4-a461-fc423b7aed52-service-ca-bundle\") pod \"router-default-5444994796-f77lm\" (UID: \"bfb40dcb-d460-44f4-a461-fc423b7aed52\") " pod="openshift-ingress/router-default-5444994796-f77lm" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.348080 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b9d77e51-e119-4f9e-b31f-90b7fec2c5be-proxy-tls\") pod \"machine-config-controller-84d6567774-z7rvn\" (UID: \"b9d77e51-e119-4f9e-b31f-90b7fec2c5be\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z7rvn" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.348100 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/9af6d235-3d3f-49c9-9338-e7302ef9deae-mountpoint-dir\") pod \"csi-hostpathplugin-vh89z\" (UID: \"9af6d235-3d3f-49c9-9338-e7302ef9deae\") " pod="hostpath-provisioner/csi-hostpathplugin-vh89z" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.348120 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/9af6d235-3d3f-49c9-9338-e7302ef9deae-plugins-dir\") pod \"csi-hostpathplugin-vh89z\" (UID: \"9af6d235-3d3f-49c9-9338-e7302ef9deae\") " pod="hostpath-provisioner/csi-hostpathplugin-vh89z" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.348172 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8f7v9\" (UniqueName: \"kubernetes.io/projected/f4dd19db-78f4-45d0-a0e6-91505f2ade61-kube-api-access-8f7v9\") pod \"migrator-59844c95c7-9xsmt\" (UID: \"f4dd19db-78f4-45d0-a0e6-91505f2ade61\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9xsmt" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.348204 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/769cd336-e909-4164-89c9-e0874926fd3d-registry-tls\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.348246 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7c6f4b8-7812-4b99-8595-fb39bdf58a5b-config\") pod \"service-ca-operator-777779d784-kkmzn\" (UID: \"a7c6f4b8-7812-4b99-8595-fb39bdf58a5b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kkmzn" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.348273 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bfb40dcb-d460-44f4-a461-fc423b7aed52-metrics-certs\") pod \"router-default-5444994796-f77lm\" (UID: \"bfb40dcb-d460-44f4-a461-fc423b7aed52\") " pod="openshift-ingress/router-default-5444994796-f77lm" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.348312 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cf8f8be1-97ee-48aa-96cd-d88d2a29da73-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6ggwm\" (UID: \"cf8f8be1-97ee-48aa-96cd-d88d2a29da73\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6ggwm" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.348346 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/769cd336-e909-4164-89c9-e0874926fd3d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.348826 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3895d19d-15a0-442a-9156-726252aea7a1-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-wgw78\" (UID: \"3895d19d-15a0-442a-9156-726252aea7a1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wgw78" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.349231 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1bff1a80-3f8e-4297-9f56-701eea3c44f4-apiservice-cert\") pod \"packageserver-d55dfcdfc-k8wg7\" (UID: \"1bff1a80-3f8e-4297-9f56-701eea3c44f4\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k8wg7" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.349361 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fbea48c4-eae5-4fab-bd21-8c69eb791cb2-trusted-ca\") pod \"ingress-operator-5b745b69d9-z22gj\" (UID: \"fbea48c4-eae5-4fab-bd21-8c69eb791cb2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z22gj" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.349709 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/84d06127-3e73-44c4-840b-6bfed7e36f84-auth-proxy-config\") pod \"machine-config-operator-74547568cd-kkhmq\" (UID: \"84d06127-3e73-44c4-840b-6bfed7e36f84\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kkhmq" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.350070 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/769cd336-e909-4164-89c9-e0874926fd3d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.350609 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/845be7d4-ea70-441f-8af2-c44870256906-config-volume\") pod \"collect-profiles-29530080-85k47\" (UID: \"845be7d4-ea70-441f-8af2-c44870256906\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530080-85k47" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.350936 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11c76755-d9b6-472e-b761-97f73562d736-config\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-mvfx4\" (UID: \"11c76755-d9b6-472e-b761-97f73562d736\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvfx4" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.351475 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b9d77e51-e119-4f9e-b31f-90b7fec2c5be-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-z7rvn\" (UID: \"b9d77e51-e119-4f9e-b31f-90b7fec2c5be\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z7rvn" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.352387 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/769cd336-e909-4164-89c9-e0874926fd3d-ca-trust-extracted\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.352864 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/769cd336-e909-4164-89c9-e0874926fd3d-trusted-ca\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.353001 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfb40dcb-d460-44f4-a461-fc423b7aed52-service-ca-bundle\") pod \"router-default-5444994796-f77lm\" (UID: \"bfb40dcb-d460-44f4-a461-fc423b7aed52\") " pod="openshift-ingress/router-default-5444994796-f77lm" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.356182 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/84d06127-3e73-44c4-840b-6bfed7e36f84-proxy-tls\") pod \"machine-config-operator-74547568cd-kkhmq\" (UID: \"84d06127-3e73-44c4-840b-6bfed7e36f84\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kkhmq" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.357260 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f7bc3cf-ab6c-413d-b5a2-9b970f5a5d85-metrics-tls\") pod \"dns-default-5w4w6\" (UID: \"3f7bc3cf-ab6c-413d-b5a2-9b970f5a5d85\") " pod="openshift-dns/dns-default-5w4w6" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.357322 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e160ff4c-3954-4858-a65a-6ca9b9051e88-certs\") pod \"machine-config-server-8g4g7\" (UID: \"e160ff4c-3954-4858-a65a-6ca9b9051e88\") " pod="openshift-machine-config-operator/machine-config-server-8g4g7" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.357348 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbrxt\" (UniqueName: \"kubernetes.io/projected/e160ff4c-3954-4858-a65a-6ca9b9051e88-kube-api-access-wbrxt\") pod \"machine-config-server-8g4g7\" (UID: \"e160ff4c-3954-4858-a65a-6ca9b9051e88\") " pod="openshift-machine-config-operator/machine-config-server-8g4g7" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.357552 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/11c76755-d9b6-472e-b761-97f73562d736-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mvfx4\" (UID: \"11c76755-d9b6-472e-b761-97f73562d736\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvfx4" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.357697 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5cf4d35e-ea62-4c76-814b-c2224558d011-srv-cert\") pod \"olm-operator-6b444d44fb-x959d\" (UID: \"5cf4d35e-ea62-4c76-814b-c2224558d011\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x959d" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.357906 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11c76755-d9b6-472e-b761-97f73562d736-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mvfx4\" (UID: \"11c76755-d9b6-472e-b761-97f73562d736\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvfx4" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.357991 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/84d06127-3e73-44c4-840b-6bfed7e36f84-images\") pod \"machine-config-operator-74547568cd-kkhmq\" (UID: \"84d06127-3e73-44c4-840b-6bfed7e36f84\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kkhmq" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.358567 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fbea48c4-eae5-4fab-bd21-8c69eb791cb2-metrics-tls\") pod \"ingress-operator-5b745b69d9-z22gj\" (UID: \"fbea48c4-eae5-4fab-bd21-8c69eb791cb2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z22gj" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.358919 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf8f8be1-97ee-48aa-96cd-d88d2a29da73-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-6ggwm\" (UID: \"cf8f8be1-97ee-48aa-96cd-d88d2a29da73\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6ggwm" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.359110 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1bff1a80-3f8e-4297-9f56-701eea3c44f4-webhook-cert\") pod \"packageserver-d55dfcdfc-k8wg7\" (UID: \"1bff1a80-3f8e-4297-9f56-701eea3c44f4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k8wg7" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.360119 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4rf7\" (UniqueName: \"kubernetes.io/projected/6d167793-63cc-42a9-9986-0cbb627e2ee4-kube-api-access-h4rf7\") pod \"catalog-operator-68c6474976-l5546\" (UID: \"6d167793-63cc-42a9-9986-0cbb627e2ee4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l5546" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.360197 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9af6d235-3d3f-49c9-9338-e7302ef9deae-registration-dir\") pod \"csi-hostpathplugin-vh89z\" (UID: \"9af6d235-3d3f-49c9-9338-e7302ef9deae\") " pod="hostpath-provisioner/csi-hostpathplugin-vh89z" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.360277 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrm86\" (UniqueName: \"kubernetes.io/projected/b9d77e51-e119-4f9e-b31f-90b7fec2c5be-kube-api-access-qrm86\") pod \"machine-config-controller-84d6567774-z7rvn\" (UID: \"b9d77e51-e119-4f9e-b31f-90b7fec2c5be\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z7rvn" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.360440 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/bd601b38-6549-439d-b1de-58e5c5c5c769-signing-cabundle\") pod \"service-ca-9c57cc56f-h9hzt\" (UID: \"bd601b38-6549-439d-b1de-58e5c5c5c769\") " pod="openshift-service-ca/service-ca-9c57cc56f-h9hzt" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.361106 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/84d06127-3e73-44c4-840b-6bfed7e36f84-images\") pod \"machine-config-operator-74547568cd-kkhmq\" (UID: \"84d06127-3e73-44c4-840b-6bfed7e36f84\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kkhmq" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.361421 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7c6f4b8-7812-4b99-8595-fb39bdf58a5b-config\") pod \"service-ca-operator-777779d784-kkmzn\" (UID: \"a7c6f4b8-7812-4b99-8595-fb39bdf58a5b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kkmzn" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.361634 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf8f8be1-97ee-48aa-96cd-d88d2a29da73-config\") pod \"kube-apiserver-operator-766d6c64bb-6ggwm\" (UID: \"cf8f8be1-97ee-48aa-96cd-d88d2a29da73\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6ggwm" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.361725 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9af6d235-3d3f-49c9-9338-e7302ef9deae-socket-dir\") pod \"csi-hostpathplugin-vh89z\" (UID: \"9af6d235-3d3f-49c9-9338-e7302ef9deae\") " pod="hostpath-provisioner/csi-hostpathplugin-vh89z" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.361906 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/33a3fbdd-f167-4093-8616-f64abea28e24-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-chd9d\" (UID: \"33a3fbdd-f167-4093-8616-f64abea28e24\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-chd9d" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.361974 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsphb\" (UniqueName: \"kubernetes.io/projected/bd601b38-6549-439d-b1de-58e5c5c5c769-kube-api-access-vsphb\") pod \"service-ca-9c57cc56f-h9hzt\" (UID: \"bd601b38-6549-439d-b1de-58e5c5c5c769\") " pod="openshift-service-ca/service-ca-9c57cc56f-h9hzt" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.362088 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3895d19d-15a0-442a-9156-726252aea7a1-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-wgw78\" (UID: \"3895d19d-15a0-442a-9156-726252aea7a1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wgw78" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.363033 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/fec1da66-feec-438b-a9d2-a8f36d8ef790-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-knx8j\" (UID: \"fec1da66-feec-438b-a9d2-a8f36d8ef790\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-knx8j" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.363108 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98bjf\" (UniqueName: \"kubernetes.io/projected/1bff1a80-3f8e-4297-9f56-701eea3c44f4-kube-api-access-98bjf\") pod \"packageserver-d55dfcdfc-k8wg7\" (UID: 
\"1bff1a80-3f8e-4297-9f56-701eea3c44f4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k8wg7" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.363185 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fbea48c4-eae5-4fab-bd21-8c69eb791cb2-bound-sa-token\") pod \"ingress-operator-5b745b69d9-z22gj\" (UID: \"fbea48c4-eae5-4fab-bd21-8c69eb791cb2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z22gj" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.363281 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/769cd336-e909-4164-89c9-e0874926fd3d-registry-certificates\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.363375 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/1bff1a80-3f8e-4297-9f56-701eea3c44f4-tmpfs\") pod \"packageserver-d55dfcdfc-k8wg7\" (UID: \"1bff1a80-3f8e-4297-9f56-701eea3c44f4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k8wg7" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.363448 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p98l2\" (UniqueName: \"kubernetes.io/projected/9af6d235-3d3f-49c9-9338-e7302ef9deae-kube-api-access-p98l2\") pod \"csi-hostpathplugin-vh89z\" (UID: \"9af6d235-3d3f-49c9-9338-e7302ef9deae\") " pod="hostpath-provisioner/csi-hostpathplugin-vh89z" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.363517 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhm8l\" (UniqueName: 
\"kubernetes.io/projected/d1190643-6cbb-4ecd-b76e-3fe0f8a7ec57-kube-api-access-nhm8l\") pod \"ingress-canary-42hhs\" (UID: \"d1190643-6cbb-4ecd-b76e-3fe0f8a7ec57\") " pod="openshift-ingress-canary/ingress-canary-42hhs" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.363587 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qhf9\" (UniqueName: \"kubernetes.io/projected/845be7d4-ea70-441f-8af2-c44870256906-kube-api-access-8qhf9\") pod \"collect-profiles-29530080-85k47\" (UID: \"845be7d4-ea70-441f-8af2-c44870256906\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530080-85k47" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.363653 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3895d19d-15a0-442a-9156-726252aea7a1-config\") pod \"kube-controller-manager-operator-78b949d7b-wgw78\" (UID: \"3895d19d-15a0-442a-9156-726252aea7a1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wgw78" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.363827 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/769cd336-e909-4164-89c9-e0874926fd3d-installation-pull-secrets\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.364237 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e160ff4c-3954-4858-a65a-6ca9b9051e88-node-bootstrap-token\") pod \"machine-config-server-8g4g7\" (UID: \"e160ff4c-3954-4858-a65a-6ca9b9051e88\") " pod="openshift-machine-config-operator/machine-config-server-8g4g7" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.364304 
4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5cf4d35e-ea62-4c76-814b-c2224558d011-profile-collector-cert\") pod \"olm-operator-6b444d44fb-x959d\" (UID: \"5cf4d35e-ea62-4c76-814b-c2224558d011\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x959d" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.364473 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3895d19d-15a0-442a-9156-726252aea7a1-config\") pod \"kube-controller-manager-operator-78b949d7b-wgw78\" (UID: \"3895d19d-15a0-442a-9156-726252aea7a1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wgw78" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.365512 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6d167793-63cc-42a9-9986-0cbb627e2ee4-profile-collector-cert\") pod \"catalog-operator-68c6474976-l5546\" (UID: \"6d167793-63cc-42a9-9986-0cbb627e2ee4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l5546" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.365516 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/1bff1a80-3f8e-4297-9f56-701eea3c44f4-tmpfs\") pod \"packageserver-d55dfcdfc-k8wg7\" (UID: \"1bff1a80-3f8e-4297-9f56-701eea3c44f4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k8wg7" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.365784 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1bff1a80-3f8e-4297-9f56-701eea3c44f4-apiservice-cert\") pod \"packageserver-d55dfcdfc-k8wg7\" (UID: \"1bff1a80-3f8e-4297-9f56-701eea3c44f4\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k8wg7" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.365889 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/115b8c26-73db-4da1-a9b6-567a7ce7151e-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-lftlx\" (UID: \"115b8c26-73db-4da1-a9b6-567a7ce7151e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lftlx" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.366188 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/769cd336-e909-4164-89c9-e0874926fd3d-registry-certificates\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.366316 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5cf4d35e-ea62-4c76-814b-c2224558d011-srv-cert\") pod \"olm-operator-6b444d44fb-x959d\" (UID: \"5cf4d35e-ea62-4c76-814b-c2224558d011\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x959d" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.366674 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7c6f4b8-7812-4b99-8595-fb39bdf58a5b-serving-cert\") pod \"service-ca-operator-777779d784-kkmzn\" (UID: \"a7c6f4b8-7812-4b99-8595-fb39bdf58a5b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kkmzn" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.367198 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b9d77e51-e119-4f9e-b31f-90b7fec2c5be-proxy-tls\") pod \"machine-config-controller-84d6567774-z7rvn\" 
(UID: \"b9d77e51-e119-4f9e-b31f-90b7fec2c5be\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z7rvn" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.367429 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/bfb40dcb-d460-44f4-a461-fc423b7aed52-stats-auth\") pod \"router-default-5444994796-f77lm\" (UID: \"bfb40dcb-d460-44f4-a461-fc423b7aed52\") " pod="openshift-ingress/router-default-5444994796-f77lm" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.370037 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11c76755-d9b6-472e-b761-97f73562d736-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mvfx4\" (UID: \"11c76755-d9b6-472e-b761-97f73562d736\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvfx4" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.372592 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/bfb40dcb-d460-44f4-a461-fc423b7aed52-default-certificate\") pod \"router-default-5444994796-f77lm\" (UID: \"bfb40dcb-d460-44f4-a461-fc423b7aed52\") " pod="openshift-ingress/router-default-5444994796-f77lm" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.372640 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3895d19d-15a0-442a-9156-726252aea7a1-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-wgw78\" (UID: \"3895d19d-15a0-442a-9156-726252aea7a1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wgw78" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.373264 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/bfb40dcb-d460-44f4-a461-fc423b7aed52-metrics-certs\") pod \"router-default-5444994796-f77lm\" (UID: \"bfb40dcb-d460-44f4-a461-fc423b7aed52\") " pod="openshift-ingress/router-default-5444994796-f77lm" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.375987 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e160ff4c-3954-4858-a65a-6ca9b9051e88-certs\") pod \"machine-config-server-8g4g7\" (UID: \"e160ff4c-3954-4858-a65a-6ca9b9051e88\") " pod="openshift-machine-config-operator/machine-config-server-8g4g7" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.376330 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/769cd336-e909-4164-89c9-e0874926fd3d-registry-tls\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.376369 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/fec1da66-feec-438b-a9d2-a8f36d8ef790-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-knx8j\" (UID: \"fec1da66-feec-438b-a9d2-a8f36d8ef790\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-knx8j" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.376562 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6d167793-63cc-42a9-9986-0cbb627e2ee4-srv-cert\") pod \"catalog-operator-68c6474976-l5546\" (UID: \"6d167793-63cc-42a9-9986-0cbb627e2ee4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l5546" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.376781 4735 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/33a3fbdd-f167-4093-8616-f64abea28e24-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-chd9d\" (UID: \"33a3fbdd-f167-4093-8616-f64abea28e24\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-chd9d" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.376865 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/bd601b38-6549-439d-b1de-58e5c5c5c769-signing-key\") pod \"service-ca-9c57cc56f-h9hzt\" (UID: \"bd601b38-6549-439d-b1de-58e5c5c5c769\") " pod="openshift-service-ca/service-ca-9c57cc56f-h9hzt" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.377656 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/845be7d4-ea70-441f-8af2-c44870256906-secret-volume\") pod \"collect-profiles-29530080-85k47\" (UID: \"845be7d4-ea70-441f-8af2-c44870256906\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530080-85k47" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.387148 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2vsk\" (UniqueName: \"kubernetes.io/projected/fec1da66-feec-438b-a9d2-a8f36d8ef790-kube-api-access-b2vsk\") pod \"package-server-manager-789f6589d5-knx8j\" (UID: \"fec1da66-feec-438b-a9d2-a8f36d8ef790\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-knx8j" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.409886 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/769cd336-e909-4164-89c9-e0874926fd3d-bound-sa-token\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:09:43 crc kubenswrapper[4735]: W0223 
00:09:43.415981 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5732e921_c103_4eda_9705_afec618471c9.slice/crio-e31e0d06e18af2ae5a96bfa45da454f0c07f62687accb6d5074ba4a1746c5821 WatchSource:0}: Error finding container e31e0d06e18af2ae5a96bfa45da454f0c07f62687accb6d5074ba4a1746c5821: Status 404 returned error can't find the container with id e31e0d06e18af2ae5a96bfa45da454f0c07f62687accb6d5074ba4a1746c5821 Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.430830 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ph5c\" (UniqueName: \"kubernetes.io/projected/bfb40dcb-d460-44f4-a461-fc423b7aed52-kube-api-access-4ph5c\") pod \"router-default-5444994796-f77lm\" (UID: \"bfb40dcb-d460-44f4-a461-fc423b7aed52\") " pod="openshift-ingress/router-default-5444994796-f77lm" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.444663 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-knx8j" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.445920 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qzhgh"] Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.452584 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrspb\" (UniqueName: \"kubernetes.io/projected/84d06127-3e73-44c4-840b-6bfed7e36f84-kube-api-access-rrspb\") pod \"machine-config-operator-74547568cd-kkhmq\" (UID: \"84d06127-3e73-44c4-840b-6bfed7e36f84\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kkhmq" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.463266 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xgbf2"] Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.480689 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7k2x\" (UniqueName: \"kubernetes.io/projected/a7c6f4b8-7812-4b99-8595-fb39bdf58a5b-kube-api-access-d7k2x\") pod \"service-ca-operator-777779d784-kkmzn\" (UID: \"a7c6f4b8-7812-4b99-8595-fb39bdf58a5b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kkmzn" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.482073 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.482112 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" 
(UniqueName: \"kubernetes.io/host-path/9af6d235-3d3f-49c9-9338-e7302ef9deae-mountpoint-dir\") pod \"csi-hostpathplugin-vh89z\" (UID: \"9af6d235-3d3f-49c9-9338-e7302ef9deae\") " pod="hostpath-provisioner/csi-hostpathplugin-vh89z" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.482138 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/9af6d235-3d3f-49c9-9338-e7302ef9deae-plugins-dir\") pod \"csi-hostpathplugin-vh89z\" (UID: \"9af6d235-3d3f-49c9-9338-e7302ef9deae\") " pod="hostpath-provisioner/csi-hostpathplugin-vh89z" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.482207 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f7bc3cf-ab6c-413d-b5a2-9b970f5a5d85-metrics-tls\") pod \"dns-default-5w4w6\" (UID: \"3f7bc3cf-ab6c-413d-b5a2-9b970f5a5d85\") " pod="openshift-dns/dns-default-5w4w6" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.482264 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9af6d235-3d3f-49c9-9338-e7302ef9deae-registration-dir\") pod \"csi-hostpathplugin-vh89z\" (UID: \"9af6d235-3d3f-49c9-9338-e7302ef9deae\") " pod="hostpath-provisioner/csi-hostpathplugin-vh89z" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.482291 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9af6d235-3d3f-49c9-9338-e7302ef9deae-socket-dir\") pod \"csi-hostpathplugin-vh89z\" (UID: \"9af6d235-3d3f-49c9-9338-e7302ef9deae\") " pod="hostpath-provisioner/csi-hostpathplugin-vh89z" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.482362 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p98l2\" (UniqueName: 
\"kubernetes.io/projected/9af6d235-3d3f-49c9-9338-e7302ef9deae-kube-api-access-p98l2\") pod \"csi-hostpathplugin-vh89z\" (UID: \"9af6d235-3d3f-49c9-9338-e7302ef9deae\") " pod="hostpath-provisioner/csi-hostpathplugin-vh89z" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.482371 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/9af6d235-3d3f-49c9-9338-e7302ef9deae-mountpoint-dir\") pod \"csi-hostpathplugin-vh89z\" (UID: \"9af6d235-3d3f-49c9-9338-e7302ef9deae\") " pod="hostpath-provisioner/csi-hostpathplugin-vh89z" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.482386 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhm8l\" (UniqueName: \"kubernetes.io/projected/d1190643-6cbb-4ecd-b76e-3fe0f8a7ec57-kube-api-access-nhm8l\") pod \"ingress-canary-42hhs\" (UID: \"d1190643-6cbb-4ecd-b76e-3fe0f8a7ec57\") " pod="openshift-ingress-canary/ingress-canary-42hhs" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.482485 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slkrn\" (UniqueName: \"kubernetes.io/projected/3f7bc3cf-ab6c-413d-b5a2-9b970f5a5d85-kube-api-access-slkrn\") pod \"dns-default-5w4w6\" (UID: \"3f7bc3cf-ab6c-413d-b5a2-9b970f5a5d85\") " pod="openshift-dns/dns-default-5w4w6" Feb 23 00:09:43 crc kubenswrapper[4735]: E0223 00:09:43.482888 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 00:09:43.982868379 +0000 UTC m=+142.446414350 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6lsk2" (UID: "769cd336-e909-4164-89c9-e0874926fd3d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.485101 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/9af6d235-3d3f-49c9-9338-e7302ef9deae-plugins-dir\") pod \"csi-hostpathplugin-vh89z\" (UID: \"9af6d235-3d3f-49c9-9338-e7302ef9deae\") " pod="hostpath-provisioner/csi-hostpathplugin-vh89z" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.486101 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/9af6d235-3d3f-49c9-9338-e7302ef9deae-csi-data-dir\") pod \"csi-hostpathplugin-vh89z\" (UID: \"9af6d235-3d3f-49c9-9338-e7302ef9deae\") " pod="hostpath-provisioner/csi-hostpathplugin-vh89z" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.486228 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f7bc3cf-ab6c-413d-b5a2-9b970f5a5d85-config-volume\") pod \"dns-default-5w4w6\" (UID: \"3f7bc3cf-ab6c-413d-b5a2-9b970f5a5d85\") " pod="openshift-dns/dns-default-5w4w6" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.486726 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d1190643-6cbb-4ecd-b76e-3fe0f8a7ec57-cert\") pod \"ingress-canary-42hhs\" (UID: \"d1190643-6cbb-4ecd-b76e-3fe0f8a7ec57\") " pod="openshift-ingress-canary/ingress-canary-42hhs" Feb 23 00:09:43 crc kubenswrapper[4735]: 
I0223 00:09:43.489492 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6xqc"] Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.490500 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/9af6d235-3d3f-49c9-9338-e7302ef9deae-csi-data-dir\") pod \"csi-hostpathplugin-vh89z\" (UID: \"9af6d235-3d3f-49c9-9338-e7302ef9deae\") " pod="hostpath-provisioner/csi-hostpathplugin-vh89z" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.491173 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3f7bc3cf-ab6c-413d-b5a2-9b970f5a5d85-config-volume\") pod \"dns-default-5w4w6\" (UID: \"3f7bc3cf-ab6c-413d-b5a2-9b970f5a5d85\") " pod="openshift-dns/dns-default-5w4w6" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.491855 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9af6d235-3d3f-49c9-9338-e7302ef9deae-registration-dir\") pod \"csi-hostpathplugin-vh89z\" (UID: \"9af6d235-3d3f-49c9-9338-e7302ef9deae\") " pod="hostpath-provisioner/csi-hostpathplugin-vh89z" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.492059 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9af6d235-3d3f-49c9-9338-e7302ef9deae-socket-dir\") pod \"csi-hostpathplugin-vh89z\" (UID: \"9af6d235-3d3f-49c9-9338-e7302ef9deae\") " pod="hostpath-provisioner/csi-hostpathplugin-vh89z" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.494224 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3f7bc3cf-ab6c-413d-b5a2-9b970f5a5d85-metrics-tls\") pod \"dns-default-5w4w6\" (UID: \"3f7bc3cf-ab6c-413d-b5a2-9b970f5a5d85\") " 
pod="openshift-dns/dns-default-5w4w6" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.495800 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d1190643-6cbb-4ecd-b76e-3fe0f8a7ec57-cert\") pod \"ingress-canary-42hhs\" (UID: \"d1190643-6cbb-4ecd-b76e-3fe0f8a7ec57\") " pod="openshift-ingress-canary/ingress-canary-42hhs" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.503278 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f7v9\" (UniqueName: \"kubernetes.io/projected/f4dd19db-78f4-45d0-a0e6-91505f2ade61-kube-api-access-8f7v9\") pod \"migrator-59844c95c7-9xsmt\" (UID: \"f4dd19db-78f4-45d0-a0e6-91505f2ade61\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9xsmt" Feb 23 00:09:43 crc kubenswrapper[4735]: W0223 00:09:43.506433 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25de1778_6e9d_4158_9aa4_02e1607d7f45.slice/crio-f7fd8bc945c60242fe77809b02a48d625e071f3f0ae869930536b2e3fa0da68f WatchSource:0}: Error finding container f7fd8bc945c60242fe77809b02a48d625e071f3f0ae869930536b2e3fa0da68f: Status 404 returned error can't find the container with id f7fd8bc945c60242fe77809b02a48d625e071f3f0ae869930536b2e3fa0da68f Feb 23 00:09:43 crc kubenswrapper[4735]: W0223 00:09:43.510870 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bab8f58_226d_43fd_8a33_668c4e060bfb.slice/crio-2a6b6c8f5f8efcf07119369109d3526fd62b6d542688704d4663fdbebb09cb1f WatchSource:0}: Error finding container 2a6b6c8f5f8efcf07119369109d3526fd62b6d542688704d4663fdbebb09cb1f: Status 404 returned error can't find the container with id 2a6b6c8f5f8efcf07119369109d3526fd62b6d542688704d4663fdbebb09cb1f Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.513962 4735 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-cwtcm\" (UniqueName: \"kubernetes.io/projected/115b8c26-73db-4da1-a9b6-567a7ce7151e-kube-api-access-cwtcm\") pod \"multus-admission-controller-857f4d67dd-lftlx\" (UID: \"115b8c26-73db-4da1-a9b6-567a7ce7151e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lftlx" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.554719 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b6f8\" (UniqueName: \"kubernetes.io/projected/769cd336-e909-4164-89c9-e0874926fd3d-kube-api-access-4b6f8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.571337 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3895d19d-15a0-442a-9156-726252aea7a1-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-wgw78\" (UID: \"3895d19d-15a0-442a-9156-726252aea7a1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wgw78" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.586405 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29530080-8vgj6"] Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.587658 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 00:09:43 crc kubenswrapper[4735]: E0223 00:09:43.588234 4735 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:09:44.088215064 +0000 UTC m=+142.551761035 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.609249 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbrxt\" (UniqueName: \"kubernetes.io/projected/e160ff4c-3954-4858-a65a-6ca9b9051e88-kube-api-access-wbrxt\") pod \"machine-config-server-8g4g7\" (UID: \"e160ff4c-3954-4858-a65a-6ca9b9051e88\") " pod="openshift-machine-config-operator/machine-config-server-8g4g7" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.614379 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/11c76755-d9b6-472e-b761-97f73562d736-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mvfx4\" (UID: \"11c76755-d9b6-472e-b761-97f73562d736\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvfx4" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.619411 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9xsmt" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.620769 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fpbc2"] Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.627547 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cf8f8be1-97ee-48aa-96cd-d88d2a29da73-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6ggwm\" (UID: \"cf8f8be1-97ee-48aa-96cd-d88d2a29da73\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6ggwm" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.647078 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-lftlx" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.653294 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbz8p\" (UniqueName: \"kubernetes.io/projected/5cf4d35e-ea62-4c76-814b-c2224558d011-kube-api-access-lbz8p\") pod \"olm-operator-6b444d44fb-x959d\" (UID: \"5cf4d35e-ea62-4c76-814b-c2224558d011\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x959d" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.656343 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lnz4f"] Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.664581 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wgw78" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.670070 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-87xcv"] Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.670908 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x959d" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.675334 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjtrd\" (UniqueName: \"kubernetes.io/projected/fbea48c4-eae5-4fab-bd21-8c69eb791cb2-kube-api-access-kjtrd\") pod \"ingress-operator-5b745b69d9-z22gj\" (UID: \"fbea48c4-eae5-4fab-bd21-8c69eb791cb2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z22gj" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.686054 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-kkmzn" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.688381 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrm86\" (UniqueName: \"kubernetes.io/projected/b9d77e51-e119-4f9e-b31f-90b7fec2c5be-kube-api-access-qrm86\") pod \"machine-config-controller-84d6567774-z7rvn\" (UID: \"b9d77e51-e119-4f9e-b31f-90b7fec2c5be\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z7rvn" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.689569 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:09:43 crc kubenswrapper[4735]: E0223 00:09:43.690105 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 00:09:44.190087717 +0000 UTC m=+142.653633688 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6lsk2" (UID: "769cd336-e909-4164-89c9-e0874926fd3d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.694448 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-f77lm" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.707660 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msn77\" (UniqueName: \"kubernetes.io/projected/33a3fbdd-f167-4093-8616-f64abea28e24-kube-api-access-msn77\") pod \"cluster-samples-operator-665b6dd947-chd9d\" (UID: \"33a3fbdd-f167-4093-8616-f64abea28e24\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-chd9d" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.712190 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kkhmq" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.740020 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvfx4" Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.760406 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-8g4g7"
Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.761705 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4rf7\" (UniqueName: \"kubernetes.io/projected/6d167793-63cc-42a9-9986-0cbb627e2ee4-kube-api-access-h4rf7\") pod \"catalog-operator-68c6474976-l5546\" (UID: \"6d167793-63cc-42a9-9986-0cbb627e2ee4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l5546"
Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.762688 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-qnx85"]
Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.763541 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-nhtzb"]
Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.767971 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qhf9\" (UniqueName: \"kubernetes.io/projected/845be7d4-ea70-441f-8af2-c44870256906-kube-api-access-8qhf9\") pod \"collect-profiles-29530080-85k47\" (UID: \"845be7d4-ea70-441f-8af2-c44870256906\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530080-85k47"
Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.771310 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsphb\" (UniqueName: \"kubernetes.io/projected/bd601b38-6549-439d-b1de-58e5c5c5c769-kube-api-access-vsphb\") pod \"service-ca-9c57cc56f-h9hzt\" (UID: \"bd601b38-6549-439d-b1de-58e5c5c5c769\") " pod="openshift-service-ca/service-ca-9c57cc56f-h9hzt"
Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.789492 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98bjf\" (UniqueName: \"kubernetes.io/projected/1bff1a80-3f8e-4297-9f56-701eea3c44f4-kube-api-access-98bjf\") pod \"packageserver-d55dfcdfc-k8wg7\" (UID: \"1bff1a80-3f8e-4297-9f56-701eea3c44f4\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k8wg7"
Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.790302 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 00:09:43 crc kubenswrapper[4735]: E0223 00:09:43.790905 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:09:44.290886393 +0000 UTC m=+142.754432364 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.813774 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xmz5p"]
Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.815933 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fbea48c4-eae5-4fab-bd21-8c69eb791cb2-bound-sa-token\") pod \"ingress-operator-5b745b69d9-z22gj\" (UID: \"fbea48c4-eae5-4fab-bd21-8c69eb791cb2\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z22gj"
Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.836061 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-knx8j"]
Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.843864 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhm8l\" (UniqueName: \"kubernetes.io/projected/d1190643-6cbb-4ecd-b76e-3fe0f8a7ec57-kube-api-access-nhm8l\") pod \"ingress-canary-42hhs\" (UID: \"d1190643-6cbb-4ecd-b76e-3fe0f8a7ec57\") " pod="openshift-ingress-canary/ingress-canary-42hhs"
Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.859352 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slkrn\" (UniqueName: \"kubernetes.io/projected/3f7bc3cf-ab6c-413d-b5a2-9b970f5a5d85-kube-api-access-slkrn\") pod \"dns-default-5w4w6\" (UID: \"3f7bc3cf-ab6c-413d-b5a2-9b970f5a5d85\") " pod="openshift-dns/dns-default-5w4w6"
Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.884690 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p98l2\" (UniqueName: \"kubernetes.io/projected/9af6d235-3d3f-49c9-9338-e7302ef9deae-kube-api-access-p98l2\") pod \"csi-hostpathplugin-vh89z\" (UID: \"9af6d235-3d3f-49c9-9338-e7302ef9deae\") " pod="hostpath-provisioner/csi-hostpathplugin-vh89z"
Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.888621 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-d2dh9"]
Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.891580 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2"
Feb 23 00:09:43 crc kubenswrapper[4735]: E0223 00:09:43.891966 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 00:09:44.391948645 +0000 UTC m=+142.855494616 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6lsk2" (UID: "769cd336-e909-4164-89c9-e0874926fd3d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.922027 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-2mld6" event={"ID":"f3bbf6e9-0c6a-4d4a-9d35-25f941df2308","Type":"ContainerStarted","Data":"95623ffc52ff7ba9f105e45cec1d8d5997989e54936dea07623b517136ab9e5e"}
Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.922085 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-2mld6" event={"ID":"f3bbf6e9-0c6a-4d4a-9d35-25f941df2308","Type":"ContainerStarted","Data":"79c349289b2081139cf83a91a9ba1ee3de5175fb18b25a6eef81043e804c1a61"}
Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.922864 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-2mld6"
Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.923830 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-fpbc2" event={"ID":"58844231-adcc-497d-83a3-bba779038cc2","Type":"ContainerStarted","Data":"f01338ae3147e30c930dd481e60983ff4a7bcc329158503b8e935162d422a10d"}
Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.924824 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-xgbf2" event={"ID":"a8e73ee2-9e80-4123-bece-02008233348f","Type":"ContainerStarted","Data":"ce786bb8b300cafa1a35ca7323817aca7d03be34b1b00bb94525ce79e7282536"}
Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.925454 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6ggwm"
Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.926643 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tjkjf"]
Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.928004 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-4mwf4" event={"ID":"f0643721-5a54-4c37-b857-474ed61ef531","Type":"ContainerStarted","Data":"369938366c528f9de8f84365d1dbb3a7608b84dd5a606c2a4e212ecf481d0ede"}
Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.928062 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-4mwf4" event={"ID":"f0643721-5a54-4c37-b857-474ed61ef531","Type":"ContainerStarted","Data":"f1a642fc1bf6902e723a32109d995429e5e61fc6b3d02ace30aa14e8ae3217f2"}
Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.928078 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-4mwf4" event={"ID":"f0643721-5a54-4c37-b857-474ed61ef531","Type":"ContainerStarted","Data":"0329dd2707cff98682bdfcf4fb073afc46b2ec20089b2942c2a684ee55a0316d"}
Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.939842 4735 patch_prober.go:28] interesting pod/console-operator-58897d9998-2mld6 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body=
Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.939936 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-2mld6" podUID="f3bbf6e9-0c6a-4d4a-9d35-25f941df2308" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused"
Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.940168 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-87xcv" event={"ID":"a14dd71a-cfac-4f4e-9cbf-89599011d970","Type":"ContainerStarted","Data":"0bc02b3acbc239e4531826dc3bb2700ecf05863322d3355eee3721cf7ed984b8"}
Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.940419 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-chd9d"
Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.949904 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jqwls"]
Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.956086 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l5546"
Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.956346 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-htb9k" event={"ID":"06270ded-7d6b-4966-9f10-a432c593bdfe","Type":"ContainerStarted","Data":"6127d59e86979d94a2084f4be118a851b1689a3ed8694d13166285bcc4b8c5dd"}
Feb 23 00:09:43 crc kubenswrapper[4735]: W0223 00:09:43.962732 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6981202b_1a1f_4b0c_b68e_6a45d9914fc8.slice/crio-91fc4e9cba8463277fb0ad4e3598cac6abdb72fc366349188442f0df7cc3e057 WatchSource:0}: Error finding container 91fc4e9cba8463277fb0ad4e3598cac6abdb72fc366349188442f0df7cc3e057: Status 404 returned error can't find the container with id 91fc4e9cba8463277fb0ad4e3598cac6abdb72fc366349188442f0df7cc3e057
Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.968610 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-dsqjf" event={"ID":"b40b3484-891d-4669-838b-90c1b8d6869e","Type":"ContainerStarted","Data":"12d65cfbb49815ffd20ce919433fa110c2d3c48be9af737b072dcd8d0c51337f"}
Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.968653 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-dsqjf" event={"ID":"b40b3484-891d-4669-838b-90c1b8d6869e","Type":"ContainerStarted","Data":"fa2e16523e0eaf1ce9e48fae7b19f40f73b95f8569aea2b5beb06eabb8e0c5c9"}
Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.968952 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-dsqjf"
Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.971809 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z7rvn"
Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.972664 4735 patch_prober.go:28] interesting pod/downloads-7954f5f757-dsqjf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body=
Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.972700 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dsqjf" podUID="b40b3484-891d-4669-838b-90c1b8d6869e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused"
Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.978775 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-h9hzt"
Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.992565 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-9xsmt"]
Feb 23 00:09:43 crc kubenswrapper[4735]: I0223 00:09:43.993723 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 00:09:43 crc kubenswrapper[4735]: E0223 00:09:43.995637 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:09:44.495618251 +0000 UTC m=+142.959164222 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:43.999486 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lnz4f" event={"ID":"e0c2b77e-cacd-4257-b3c1-699bd3f4261f","Type":"ContainerStarted","Data":"87d8325f2c9da0c1951d314e82d5d9288a02e730a865fcfaf63d915850ddbd1d"}
Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.034167 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530080-85k47"
Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.039443 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z22gj"
Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.041612 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wgw78"]
Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.042054 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-kkmzn"]
Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.042215 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k8wg7"
Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.065392 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-42hhs"
Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.072634 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g2t27" event={"ID":"792fd1c0-de50-429b-89ff-6a3f64541e29","Type":"ContainerStarted","Data":"b614701c8cca4b240a681c3048bd0e93e7b92df5fcfe5aead5b3fcee08f272fc"}
Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.074153 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5w4w6"
Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.092661 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x959d"]
Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.095419 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2"
Feb 23 00:09:44 crc kubenswrapper[4735]: E0223 00:09:44.095728 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 00:09:44.595714311 +0000 UTC m=+143.059260282 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6lsk2" (UID: "769cd336-e909-4164-89c9-e0874926fd3d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.100366 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-vh89z"
Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.128726 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-g5xff" event={"ID":"92d294e9-091b-420f-aae8-da3bcab119e4","Type":"ContainerStarted","Data":"28e4b7825ca4a956b37601061185609dfd4c0b9fd3823d2188e1bc41bda89670"}
Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.128768 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-g5xff" event={"ID":"92d294e9-091b-420f-aae8-da3bcab119e4","Type":"ContainerStarted","Data":"fa15599029f0082c5fe3532f623f90834446952fe21c87924d51af853c5c3b9d"}
Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.132117 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7" event={"ID":"cf538d4f-1acd-4e61-9827-7430d1099138","Type":"ContainerStarted","Data":"abc155ad6df61fdd08392e424628e1cee5325189224dbd35ce2627b3683726ae"}
Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.132157 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7" event={"ID":"cf538d4f-1acd-4e61-9827-7430d1099138","Type":"ContainerStarted","Data":"36dbe32caa78c08a4de251b13d878b7abdd28dcb0ec2dcf42ff6beb905c768d7"}
Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.132722 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7"
Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.136244 4735 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-jvnk7 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.15:6443/healthz\": dial tcp 10.217.0.15:6443: connect: connection refused" start-of-body=
Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.136323 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7" podUID="cf538d4f-1acd-4e61-9827-7430d1099138" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.15:6443/healthz\": dial tcp 10.217.0.15:6443: connect: connection refused"
Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.146596 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29530080-8vgj6" event={"ID":"6cb942f6-2bc5-4662-929c-849ec29baddc","Type":"ContainerStarted","Data":"0cd610a4df48fd174a26ab2f61b4eb6765cee4325326738785ad068fab9e5899"}
Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.153224 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qnx85" event={"ID":"6af08a92-85a9-4200-a0d8-abff73e0e93b","Type":"ContainerStarted","Data":"45bd546c66942c2a906b7bd4cb49412c8703ea23ac643a2d2fc75de7cdaf71f1"}
Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.159961 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wtlds" event={"ID":"5732e921-c103-4eda-9705-afec618471c9","Type":"ContainerStarted","Data":"e088cb523671e2f7293c835f49fb674346b14d1a7e18aaca00142c9c171c2b6b"}
Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.160008 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wtlds" event={"ID":"5732e921-c103-4eda-9705-afec618471c9","Type":"ContainerStarted","Data":"e31e0d06e18af2ae5a96bfa45da454f0c07f62687accb6d5074ba4a1746c5821"}
Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.163042 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qzhgh" event={"ID":"8bab8f58-226d-43fd-8a33-668c4e060bfb","Type":"ContainerStarted","Data":"2a6b6c8f5f8efcf07119369109d3526fd62b6d542688704d4663fdbebb09cb1f"}
Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.173037 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6xqc" event={"ID":"25de1778-6e9d-4158-9aa4-02e1607d7f45","Type":"ContainerStarted","Data":"2703a3cb67f5d683979d8142eb05624f0b79a1b83a77baa096444c226b85cd39"}
Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.173385 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6xqc" event={"ID":"25de1778-6e9d-4158-9aa4-02e1607d7f45","Type":"ContainerStarted","Data":"f7fd8bc945c60242fe77809b02a48d625e071f3f0ae869930536b2e3fa0da68f"}
Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.175578 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6xqc"
Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.175991 4735 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-c6xqc container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body=
Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.176034 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6xqc" podUID="25de1778-6e9d-4158-9aa4-02e1607d7f45" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused"
Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.191246 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-lftlx"]
Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.196707 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 00:09:44 crc kubenswrapper[4735]: E0223 00:09:44.198488 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:09:44.698464594 +0000 UTC m=+143.162010565 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.217043 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvfx4"]
Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.257153 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-kkhmq"]
Feb 23 00:09:44 crc kubenswrapper[4735]: W0223 00:09:44.292722 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod115b8c26_73db_4da1_a9b6_567a7ce7151e.slice/crio-3fcf9de95aaf75f17969c18f48c936b5d257ed4849e575afe0f339c9468551ed WatchSource:0}: Error finding container 3fcf9de95aaf75f17969c18f48c936b5d257ed4849e575afe0f339c9468551ed: Status 404 returned error can't find the container with id 3fcf9de95aaf75f17969c18f48c936b5d257ed4849e575afe0f339c9468551ed
Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.299621 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2"
Feb 23 00:09:44 crc kubenswrapper[4735]: E0223 00:09:44.300034 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 00:09:44.800017608 +0000 UTC m=+143.263563579 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6lsk2" (UID: "769cd336-e909-4164-89c9-e0874926fd3d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.338307 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l5546"]
Feb 23 00:09:44 crc kubenswrapper[4735]: W0223 00:09:44.368258 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode160ff4c_3954_4858_a65a_6ca9b9051e88.slice/crio-671996046cb4958f539b98215dca1ee9c4444acd16626b067e809b2d689eeb4f WatchSource:0}: Error finding container 671996046cb4958f539b98215dca1ee9c4444acd16626b067e809b2d689eeb4f: Status 404 returned error can't find the container with id 671996046cb4958f539b98215dca1ee9c4444acd16626b067e809b2d689eeb4f
Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.385754 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-dsqjf" podStartSLOduration=116.385734641 podStartE2EDuration="1m56.385734641s" podCreationTimestamp="2026-02-23 00:07:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:09:44.383401865 +0000 UTC m=+142.846947836" watchObservedRunningTime="2026-02-23 00:09:44.385734641 +0000 UTC m=+142.849280612"
Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.405761 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 00:09:44 crc kubenswrapper[4735]: E0223 00:09:44.406186 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:09:44.906171923 +0000 UTC m=+143.369717894 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 00:09:44 crc kubenswrapper[4735]: W0223 00:09:44.411762 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d167793_63cc_42a9_9986_0cbb627e2ee4.slice/crio-3eb33528554b865394b6cf6f54683b69f07b05d114a921c48b2e381273f92193 WatchSource:0}: Error finding container 3eb33528554b865394b6cf6f54683b69f07b05d114a921c48b2e381273f92193: Status 404 returned error can't find the container with id 3eb33528554b865394b6cf6f54683b69f07b05d114a921c48b2e381273f92193
Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.488744 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6ggwm"]
Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.507677 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2"
Feb 23 00:09:44 crc kubenswrapper[4735]: E0223 00:09:44.507955 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 00:09:45.007944233 +0000 UTC m=+143.471490204 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6lsk2" (UID: "769cd336-e909-4164-89c9-e0874926fd3d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.609737 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 00:09:44 crc kubenswrapper[4735]: E0223 00:09:44.610553 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:09:45.110531732 +0000 UTC m=+143.574077693 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.616768 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k8wg7"]
Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.629490 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wtlds" podStartSLOduration=115.629468128 podStartE2EDuration="1m55.629468128s" podCreationTimestamp="2026-02-23 00:07:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:09:44.624926299 +0000 UTC m=+143.088472270" watchObservedRunningTime="2026-02-23 00:09:44.629468128 +0000 UTC m=+143.093014089"
Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.680633 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qzhgh" podStartSLOduration=116.680621119 podStartE2EDuration="1m56.680621119s" podCreationTimestamp="2026-02-23 00:07:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:09:44.678651602 +0000 UTC m=+143.142197573" watchObservedRunningTime="2026-02-23 00:09:44.680621119 +0000 UTC m=+143.144167090"
Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.680768 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-vh89z"]
Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.712780 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2"
Feb 23 00:09:44 crc kubenswrapper[4735]: E0223 00:09:44.713158 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 00:09:45.213138152 +0000 UTC m=+143.676684123 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6lsk2" (UID: "769cd336-e909-4164-89c9-e0874926fd3d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.821578 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 00:09:44 crc kubenswrapper[4735]: E0223 00:09:44.821763 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:09:45.321736416 +0000 UTC m=+143.785282387 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.822169 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:09:44 crc kubenswrapper[4735]: E0223 00:09:44.822536 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 00:09:45.322521535 +0000 UTC m=+143.786067506 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6lsk2" (UID: "769cd336-e909-4164-89c9-e0874926fd3d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:09:44 crc kubenswrapper[4735]: W0223 00:09:44.828463 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf8f8be1_97ee_48aa_96cd_d88d2a29da73.slice/crio-095cddb8cc20ecbe97ed824789c43de98b28396271802c71f24e995cb118097b WatchSource:0}: Error finding container 095cddb8cc20ecbe97ed824789c43de98b28396271802c71f24e995cb118097b: Status 404 returned error can't find the container with id 095cddb8cc20ecbe97ed824789c43de98b28396271802c71f24e995cb118097b Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.924151 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 00:09:44 crc kubenswrapper[4735]: E0223 00:09:44.924545 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:09:45.424528171 +0000 UTC m=+143.888074142 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.950514 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7" podStartSLOduration=116.950499755 podStartE2EDuration="1m56.950499755s" podCreationTimestamp="2026-02-23 00:07:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:09:44.949393618 +0000 UTC m=+143.412939589" watchObservedRunningTime="2026-02-23 00:09:44.950499755 +0000 UTC m=+143.414045726" Feb 23 00:09:44 crc kubenswrapper[4735]: I0223 00:09:44.971738 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-z7rvn"] Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.025550 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:09:45 crc kubenswrapper[4735]: E0223 00:09:45.025997 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-23 00:09:45.525982712 +0000 UTC m=+143.989528683 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6lsk2" (UID: "769cd336-e909-4164-89c9-e0874926fd3d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.035235 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-g5xff" podStartSLOduration=117.035206225 podStartE2EDuration="1m57.035206225s" podCreationTimestamp="2026-02-23 00:07:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:09:44.991017071 +0000 UTC m=+143.454563032" watchObservedRunningTime="2026-02-23 00:09:45.035206225 +0000 UTC m=+143.498752196" Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.095954 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-chd9d"] Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.096084 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-h9hzt"] Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.136678 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 00:09:45 crc kubenswrapper[4735]: E0223 00:09:45.137294 4735 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:09:45.637265221 +0000 UTC m=+144.100811192 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.137589 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:09:45 crc kubenswrapper[4735]: E0223 00:09:45.150690 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 00:09:45.650665713 +0000 UTC m=+144.114211684 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6lsk2" (UID: "769cd336-e909-4164-89c9-e0874926fd3d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.169900 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-42hhs"] Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.223477 4735 generic.go:334] "Generic (PLEG): container finished" podID="a14dd71a-cfac-4f4e-9cbf-89599011d970" containerID="da2d5867d363ab277f7f5f2c0d98cdf5e573ed74580913e831fd8ea41692b6c3" exitCode=0 Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.223556 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-87xcv" event={"ID":"a14dd71a-cfac-4f4e-9cbf-89599011d970","Type":"ContainerDied","Data":"da2d5867d363ab277f7f5f2c0d98cdf5e573ed74580913e831fd8ea41692b6c3"} Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.244492 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 00:09:45 crc kubenswrapper[4735]: E0223 00:09:45.244942 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:09:45.744912992 +0000 UTC m=+144.208458963 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.261439 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-kkmzn" event={"ID":"a7c6f4b8-7812-4b99-8595-fb39bdf58a5b","Type":"ContainerStarted","Data":"34d22fca106c6af98c6845d7091dda9c06b6307bd01d325a675927b99ae1713e"} Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.265994 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530080-85k47"] Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.280353 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kkhmq" event={"ID":"84d06127-3e73-44c4-840b-6bfed7e36f84","Type":"ContainerStarted","Data":"b8593e6c2f3505afc2caf86954a11e645c38efa1373b998f711df4b76b75a021"} Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.309684 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-8g4g7" event={"ID":"e160ff4c-3954-4858-a65a-6ca9b9051e88","Type":"ContainerStarted","Data":"671996046cb4958f539b98215dca1ee9c4444acd16626b067e809b2d689eeb4f"} Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.331402 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29530080-8vgj6" 
event={"ID":"6cb942f6-2bc5-4662-929c-849ec29baddc","Type":"ContainerStarted","Data":"bb9ab9016eedde4179bdbf86f86dfceb4cf2b8ddb2635d00305f46180285056f"} Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.335631 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-z22gj"] Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.342610 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6ggwm" event={"ID":"cf8f8be1-97ee-48aa-96cd-d88d2a29da73","Type":"ContainerStarted","Data":"095cddb8cc20ecbe97ed824789c43de98b28396271802c71f24e995cb118097b"} Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.343768 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-qzhgh" event={"ID":"8bab8f58-226d-43fd-8a33-668c4e060bfb","Type":"ContainerStarted","Data":"b21d5cca1826e514e4519133d21ae752b0c0bdef6dc38276aa205cfa400f7f68"} Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.345325 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:09:45 crc kubenswrapper[4735]: E0223 00:09:45.346014 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 00:09:45.846001336 +0000 UTC m=+144.309547297 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6lsk2" (UID: "769cd336-e909-4164-89c9-e0874926fd3d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.346884 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-nhtzb" event={"ID":"6981202b-1a1f-4b0c-b68e-6a45d9914fc8","Type":"ContainerStarted","Data":"9b572d16b900194c267768d7ffb53293b855f752ffca6e70cd076c0bdf2c15b9"} Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.346910 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-nhtzb" event={"ID":"6981202b-1a1f-4b0c-b68e-6a45d9914fc8","Type":"ContainerStarted","Data":"91fc4e9cba8463277fb0ad4e3598cac6abdb72fc366349188442f0df7cc3e057"} Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.348767 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-4mwf4" podStartSLOduration=116.348745542 podStartE2EDuration="1m56.348745542s" podCreationTimestamp="2026-02-23 00:07:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:09:45.348727371 +0000 UTC m=+143.812273352" watchObservedRunningTime="2026-02-23 00:09:45.348745542 +0000 UTC m=+143.812291513" Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.355715 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wgw78" 
event={"ID":"3895d19d-15a0-442a-9156-726252aea7a1","Type":"ContainerStarted","Data":"7713ba1069bd9a4431d34c9c1eeb7c3510efc703ca6304beaf639eb8ee1aba8b"} Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.355798 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wgw78" event={"ID":"3895d19d-15a0-442a-9156-726252aea7a1","Type":"ContainerStarted","Data":"efa7d661b60d44113920ec7a73fd7b1bc96394b6f55860f984a994d5166e94c3"} Feb 23 00:09:45 crc kubenswrapper[4735]: W0223 00:09:45.358525 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod845be7d4_ea70_441f_8af2_c44870256906.slice/crio-a8d6e74bd420642313c860dd5a435255729bb10dad9746b51017fceefe49778e WatchSource:0}: Error finding container a8d6e74bd420642313c860dd5a435255729bb10dad9746b51017fceefe49778e: Status 404 returned error can't find the container with id a8d6e74bd420642313c860dd5a435255729bb10dad9746b51017fceefe49778e Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.359055 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l5546" event={"ID":"6d167793-63cc-42a9-9986-0cbb627e2ee4","Type":"ContainerStarted","Data":"3eb33528554b865394b6cf6f54683b69f07b05d114a921c48b2e381273f92193"} Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.364529 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k8wg7" event={"ID":"1bff1a80-3f8e-4297-9f56-701eea3c44f4","Type":"ContainerStarted","Data":"9737067a87a35fba4d83f25e32c2cffeed7cbbc4410346599948773af42e5f04"} Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.369140 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-tjkjf" 
event={"ID":"299ad63a-4584-4079-81c4-b8645326d3d0","Type":"ContainerStarted","Data":"6af1468a116382ceeef9fe83bb36864e6e9681d1dc040b15439acb88988d38b3"} Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.370182 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-tjkjf" Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.372424 4735 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-tjkjf container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.372487 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-tjkjf" podUID="299ad63a-4584-4079-81c4-b8645326d3d0" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.373043 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vh89z" event={"ID":"9af6d235-3d3f-49c9-9338-e7302ef9deae","Type":"ContainerStarted","Data":"1d30b1c0df0061ae64ff76d5b11877efbcc0ddb053b35e37f915fe3f8bd8c079"} Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.380050 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-d2dh9" event={"ID":"a4c1e470-e025-4285-a712-52b4fead0799","Type":"ContainerStarted","Data":"573539f0ca42167030c94e5acc17008542c40045e5e07d0bd1e04645f5108a76"} Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.381933 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6xqc" 
podStartSLOduration=116.38191932 podStartE2EDuration="1m56.38191932s" podCreationTimestamp="2026-02-23 00:07:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:09:45.379921652 +0000 UTC m=+143.843467623" watchObservedRunningTime="2026-02-23 00:09:45.38191932 +0000 UTC m=+143.845465291" Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.389056 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-knx8j" event={"ID":"fec1da66-feec-438b-a9d2-a8f36d8ef790","Type":"ContainerStarted","Data":"da0989ee9a33c2b32eda1d72dafce6370723d77743b6e5a1777dc255b9780669"} Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.389103 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-knx8j" event={"ID":"fec1da66-feec-438b-a9d2-a8f36d8ef790","Type":"ContainerStarted","Data":"d5ea7e67822891379c6e48f7b7e88023217b11a0163cea1e9ba9ec4d69efab4b"} Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.391663 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-lftlx" event={"ID":"115b8c26-73db-4da1-a9b6-567a7ce7151e","Type":"ContainerStarted","Data":"3fcf9de95aaf75f17969c18f48c936b5d257ed4849e575afe0f339c9468551ed"} Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.395790 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xmz5p" event={"ID":"0b661fda-d14e-4491-896c-4d6812a638b5","Type":"ContainerStarted","Data":"50b498830bd9a9bdb94d3fe96a9cdd3652df87934d3bea33476184d2baeac199"} Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.411797 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5w4w6"] Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 
00:09:45.414426 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-fpbc2" event={"ID":"58844231-adcc-497d-83a3-bba779038cc2","Type":"ContainerStarted","Data":"37674b7dd11e993454cf9296464000d371864eadb768dd85e276bc12b726f400"} Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.415021 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-fpbc2" Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.423752 4735 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-fpbc2 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/healthz\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.423831 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-fpbc2" podUID="58844231-adcc-497d-83a3-bba779038cc2" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.26:8080/healthz\": dial tcp 10.217.0.26:8080: connect: connection refused" Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.425636 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-2mld6" podStartSLOduration=117.425609612 podStartE2EDuration="1m57.425609612s" podCreationTimestamp="2026-02-23 00:07:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:09:45.424918055 +0000 UTC m=+143.888464026" watchObservedRunningTime="2026-02-23 00:09:45.425609612 +0000 UTC m=+143.889155573" Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.430715 4735 generic.go:334] "Generic (PLEG): container finished" podID="6af08a92-85a9-4200-a0d8-abff73e0e93b" 
containerID="3ec34ca14e9c6c787bcf5a53e27e3be83d5c24b019d62a29b4a795c2ad9f1ad4" exitCode=0 Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.430808 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qnx85" event={"ID":"6af08a92-85a9-4200-a0d8-abff73e0e93b","Type":"ContainerDied","Data":"3ec34ca14e9c6c787bcf5a53e27e3be83d5c24b019d62a29b4a795c2ad9f1ad4"} Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.446314 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 00:09:45 crc kubenswrapper[4735]: E0223 00:09:45.447874 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:09:45.947835106 +0000 UTC m=+144.411381077 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.448103 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z7rvn" event={"ID":"b9d77e51-e119-4f9e-b31f-90b7fec2c5be","Type":"ContainerStarted","Data":"68766062a8964af6df963756c87daa0165000927acbd927b7d34b6e45b870b40"} Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.474944 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-htb9k" event={"ID":"06270ded-7d6b-4966-9f10-a432c593bdfe","Type":"ContainerStarted","Data":"c1cfe24eafd6ec32c6dc092e524b64dcbe4b559257c41440fdd46269fcf3cd02"} Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.483655 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvfx4" event={"ID":"11c76755-d9b6-472e-b761-97f73562d736","Type":"ContainerStarted","Data":"df33de2bef98ca97eae830b58b850ca69e1536b2b5d08351d2daf5639e568697"} Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.499838 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-xgbf2" event={"ID":"a8e73ee2-9e80-4123-bece-02008233348f","Type":"ContainerStarted","Data":"09cf45cbc8266ed12df0a0d98a4a101bf9dd0b00dd3fdc8bba5e43d4aea9216a"} Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.507179 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x959d" event={"ID":"5cf4d35e-ea62-4c76-814b-c2224558d011","Type":"ContainerStarted","Data":"472cb234ffbab0233e07f0fc9dd4d73371483814b42151e27ee4d9457779861d"}
Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.538457 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jqwls" event={"ID":"defd5be8-af50-4f87-a9a5-c166a9e3ce44","Type":"ContainerStarted","Data":"b41feab68b4e4d3a8c4447ff69f28fc319da74bfe3bd9e840a7fee26d85a63fe"}
Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.546016 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-f77lm" event={"ID":"bfb40dcb-d460-44f4-a461-fc423b7aed52","Type":"ContainerStarted","Data":"3587756ae5711177ec1fff1ddc532a15393b830232fb13d19481c5638a9dcbc8"}
Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.548209 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2"
Feb 23 00:09:45 crc kubenswrapper[4735]: E0223 00:09:45.548655 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 00:09:46.048637313 +0000 UTC m=+144.512183344 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6lsk2" (UID: "769cd336-e909-4164-89c9-e0874926fd3d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.550737 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9xsmt" event={"ID":"f4dd19db-78f4-45d0-a0e6-91505f2ade61","Type":"ContainerStarted","Data":"dd612e22e6ccf9a551596893e24424230ee90c1c2c30928654efc2ab1454de8d"}
Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.556218 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lnz4f" event={"ID":"e0c2b77e-cacd-4257-b3c1-699bd3f4261f","Type":"ContainerStarted","Data":"415e5187d420e3cce2eeaf98193501a5dc75ded433cc55754dc0d52ea8f4900b"}
Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.561441 4735 generic.go:334] "Generic (PLEG): container finished" podID="792fd1c0-de50-429b-89ff-6a3f64541e29" containerID="8e7fd1e8a69ca781d45bce29282c7802075e28f20275b5aee22d0577b1968b2d" exitCode=0
Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.561497 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g2t27" event={"ID":"792fd1c0-de50-429b-89ff-6a3f64541e29","Type":"ContainerDied","Data":"8e7fd1e8a69ca781d45bce29282c7802075e28f20275b5aee22d0577b1968b2d"}
Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.563040 4735 patch_prober.go:28] interesting pod/downloads-7954f5f757-dsqjf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body=
Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.563090 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dsqjf" podUID="b40b3484-891d-4669-838b-90c1b8d6869e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused"
Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.568416 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6xqc"
Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.568832 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7"
Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.575761 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-2mld6"
Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.651333 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 00:09:45 crc kubenswrapper[4735]: E0223 00:09:45.652728 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:09:46.152705508 +0000 UTC m=+144.616251569 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.665527 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-nhtzb" podStartSLOduration=117.665513406 podStartE2EDuration="1m57.665513406s" podCreationTimestamp="2026-02-23 00:07:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:09:45.631811105 +0000 UTC m=+144.095357076" watchObservedRunningTime="2026-02-23 00:09:45.665513406 +0000 UTC m=+144.129059377"
Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.701979 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lnz4f" podStartSLOduration=116.701961174 podStartE2EDuration="1m56.701961174s" podCreationTimestamp="2026-02-23 00:07:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:09:45.667276469 +0000 UTC m=+144.130822460" watchObservedRunningTime="2026-02-23 00:09:45.701961174 +0000 UTC m=+144.165507145"
Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.703052 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-pruner-29530080-8vgj6" podStartSLOduration=117.70304591 podStartE2EDuration="1m57.70304591s" podCreationTimestamp="2026-02-23 00:07:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:09:45.701304377 +0000 UTC m=+144.164850338" watchObservedRunningTime="2026-02-23 00:09:45.70304591 +0000 UTC m=+144.166591881"
Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.753724 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2"
Feb 23 00:09:45 crc kubenswrapper[4735]: E0223 00:09:45.754466 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 00:09:46.254188071 +0000 UTC m=+144.717734042 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6lsk2" (UID: "769cd336-e909-4164-89c9-e0874926fd3d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.793293 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-xgbf2" podStartSLOduration=116.793273451 podStartE2EDuration="1m56.793273451s" podCreationTimestamp="2026-02-23 00:07:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:09:45.787539963 +0000 UTC m=+144.251085924" watchObservedRunningTime="2026-02-23 00:09:45.793273451 +0000 UTC m=+144.256819422"
Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.858670 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 00:09:45 crc kubenswrapper[4735]: E0223 00:09:45.859947 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:09:46.359928946 +0000 UTC m=+144.823474917 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.877325 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-tjkjf" podStartSLOduration=117.877306174 podStartE2EDuration="1m57.877306174s" podCreationTimestamp="2026-02-23 00:07:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:09:45.829072514 +0000 UTC m=+144.292618485" watchObservedRunningTime="2026-02-23 00:09:45.877306174 +0000 UTC m=+144.340852145"
Feb 23 00:09:45 crc kubenswrapper[4735]: I0223 00:09:45.966563 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2"
Feb 23 00:09:45 crc kubenswrapper[4735]: E0223 00:09:45.966927 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 00:09:46.466914371 +0000 UTC m=+144.930460342 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6lsk2" (UID: "769cd336-e909-4164-89c9-e0874926fd3d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 00:09:46 crc kubenswrapper[4735]: I0223 00:09:46.078374 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 00:09:46 crc kubenswrapper[4735]: E0223 00:09:46.078535 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:09:46.578505707 +0000 UTC m=+145.042051668 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 00:09:46 crc kubenswrapper[4735]: I0223 00:09:46.079029 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2"
Feb 23 00:09:46 crc kubenswrapper[4735]: E0223 00:09:46.079365 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 00:09:46.579352147 +0000 UTC m=+145.042898118 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6lsk2" (UID: "769cd336-e909-4164-89c9-e0874926fd3d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 00:09:46 crc kubenswrapper[4735]: I0223 00:09:46.134715 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xmz5p" podStartSLOduration=117.13469247 podStartE2EDuration="1m57.13469247s" podCreationTimestamp="2026-02-23 00:07:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:09:46.132329403 +0000 UTC m=+144.595875374" watchObservedRunningTime="2026-02-23 00:09:46.13469247 +0000 UTC m=+144.598238441"
Feb 23 00:09:46 crc kubenswrapper[4735]: I0223 00:09:46.135649 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-fpbc2" podStartSLOduration=117.135642503 podStartE2EDuration="1m57.135642503s" podCreationTimestamp="2026-02-23 00:07:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:09:46.106088182 +0000 UTC m=+144.569634153" watchObservedRunningTime="2026-02-23 00:09:46.135642503 +0000 UTC m=+144.599188474"
Feb 23 00:09:46 crc kubenswrapper[4735]: I0223 00:09:46.180948 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 00:09:46 crc kubenswrapper[4735]: E0223 00:09:46.181546 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:09:46.681529237 +0000 UTC m=+145.145075208 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 00:09:46 crc kubenswrapper[4735]: I0223 00:09:46.183742 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-wgw78" podStartSLOduration=117.183695829 podStartE2EDuration="1m57.183695829s" podCreationTimestamp="2026-02-23 00:07:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:09:46.182278135 +0000 UTC m=+144.645824106" watchObservedRunningTime="2026-02-23 00:09:46.183695829 +0000 UTC m=+144.647241800"
Feb 23 00:09:46 crc kubenswrapper[4735]: I0223 00:09:46.282992 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2"
Feb 23 00:09:46 crc kubenswrapper[4735]: E0223 00:09:46.283330 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 00:09:46.783316517 +0000 UTC m=+145.246862488 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6lsk2" (UID: "769cd336-e909-4164-89c9-e0874926fd3d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 00:09:46 crc kubenswrapper[4735]: I0223 00:09:46.383994 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 00:09:46 crc kubenswrapper[4735]: E0223 00:09:46.384193 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:09:46.884153614 +0000 UTC m=+145.347699585 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 00:09:46 crc kubenswrapper[4735]: I0223 00:09:46.386832 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2"
Feb 23 00:09:46 crc kubenswrapper[4735]: E0223 00:09:46.387404 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 00:09:46.887389352 +0000 UTC m=+145.350935323 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6lsk2" (UID: "769cd336-e909-4164-89c9-e0874926fd3d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 00:09:46 crc kubenswrapper[4735]: I0223 00:09:46.488772 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 00:09:46 crc kubenswrapper[4735]: E0223 00:09:46.488934 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:09:46.988904116 +0000 UTC m=+145.452450087 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 00:09:46 crc kubenswrapper[4735]: I0223 00:09:46.489191 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2"
Feb 23 00:09:46 crc kubenswrapper[4735]: E0223 00:09:46.489551 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 00:09:46.989543441 +0000 UTC m=+145.453089412 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6lsk2" (UID: "769cd336-e909-4164-89c9-e0874926fd3d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 00:09:46 crc kubenswrapper[4735]: I0223 00:09:46.591293 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 00:09:46 crc kubenswrapper[4735]: E0223 00:09:46.591821 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:09:47.091803962 +0000 UTC m=+145.555349933 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 00:09:46 crc kubenswrapper[4735]: I0223 00:09:46.610945 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6ggwm" event={"ID":"cf8f8be1-97ee-48aa-96cd-d88d2a29da73","Type":"ContainerStarted","Data":"c30478f17d9990244d5ea4d7e1a6b4a998bd36344c42801e0ed5153ff7176078"}
Feb 23 00:09:46 crc kubenswrapper[4735]: I0223 00:09:46.648514 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6ggwm" podStartSLOduration=117.648497467 podStartE2EDuration="1m57.648497467s" podCreationTimestamp="2026-02-23 00:07:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:09:46.646578281 +0000 UTC m=+145.110124252" watchObservedRunningTime="2026-02-23 00:09:46.648497467 +0000 UTC m=+145.112043438"
Feb 23 00:09:46 crc kubenswrapper[4735]: I0223 00:09:46.665316 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-htb9k" event={"ID":"06270ded-7d6b-4966-9f10-a432c593bdfe","Type":"ContainerStarted","Data":"3573c25acb898fe93d6d8418e0f4696fa4014684f26cbc4c92fe1768730a7e42"}
Feb 23 00:09:46 crc kubenswrapper[4735]: I0223 00:09:46.693439 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2"
Feb 23 00:09:46 crc kubenswrapper[4735]: E0223 00:09:46.693813 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 00:09:47.193792747 +0000 UTC m=+145.657338718 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6lsk2" (UID: "769cd336-e909-4164-89c9-e0874926fd3d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 00:09:46 crc kubenswrapper[4735]: I0223 00:09:46.699048 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jqwls" event={"ID":"defd5be8-af50-4f87-a9a5-c166a9e3ce44","Type":"ContainerStarted","Data":"e882582c8f4f3555eb08d7c27617f4dc8ebc9299c27464307c28212fbdc75f6c"}
Feb 23 00:09:46 crc kubenswrapper[4735]: I0223 00:09:46.736302 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-htb9k" podStartSLOduration=118.73626925 podStartE2EDuration="1m58.73626925s" podCreationTimestamp="2026-02-23 00:07:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:09:46.688456709 +0000 UTC m=+145.152002680" watchObservedRunningTime="2026-02-23 00:09:46.73626925 +0000 UTC m=+145.199815221"
Feb 23 00:09:46 crc kubenswrapper[4735]: I0223 00:09:46.759946 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k8wg7" event={"ID":"1bff1a80-3f8e-4297-9f56-701eea3c44f4","Type":"ContainerStarted","Data":"928587b2692e7a01f30b872d3acef1b27142f9b582ff34aa28e993a138f777fb"}
Feb 23 00:09:46 crc kubenswrapper[4735]: I0223 00:09:46.760303 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k8wg7"
Feb 23 00:09:46 crc kubenswrapper[4735]: I0223 00:09:46.785902 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jqwls" podStartSLOduration=118.785873314 podStartE2EDuration="1m58.785873314s" podCreationTimestamp="2026-02-23 00:07:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:09:46.735387539 +0000 UTC m=+145.198933510" watchObservedRunningTime="2026-02-23 00:09:46.785873314 +0000 UTC m=+145.249419285"
Feb 23 00:09:46 crc kubenswrapper[4735]: I0223 00:09:46.785960 4735 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-k8wg7 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:5443/healthz\": dial tcp 10.217.0.28:5443: connect: connection refused" start-of-body=
Feb 23 00:09:46 crc kubenswrapper[4735]: I0223 00:09:46.791532 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k8wg7" podUID="1bff1a80-3f8e-4297-9f56-701eea3c44f4" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.28:5443/healthz\": dial tcp 10.217.0.28:5443: connect: connection refused"
Feb 23 00:09:46 crc kubenswrapper[4735]: I0223 00:09:46.792337 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-lftlx" event={"ID":"115b8c26-73db-4da1-a9b6-567a7ce7151e","Type":"ContainerStarted","Data":"56e90a6a609069896db50f44730cfe76cae07ab40142a544efb6a6aaf7b0d98e"}
Feb 23 00:09:46 crc kubenswrapper[4735]: I0223 00:09:46.792947 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k8wg7" podStartSLOduration=117.792914733 podStartE2EDuration="1m57.792914733s" podCreationTimestamp="2026-02-23 00:07:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:09:46.78526973 +0000 UTC m=+145.248815701" watchObservedRunningTime="2026-02-23 00:09:46.792914733 +0000 UTC m=+145.256460704"
Feb 23 00:09:46 crc kubenswrapper[4735]: I0223 00:09:46.802128 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 00:09:46 crc kubenswrapper[4735]: E0223 00:09:46.802486 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:09:47.302452783 +0000 UTC m=+145.765998754 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 00:09:46 crc kubenswrapper[4735]: I0223 00:09:46.803692 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2"
Feb 23 00:09:46 crc kubenswrapper[4735]: I0223 00:09:46.804369 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530080-85k47" event={"ID":"845be7d4-ea70-441f-8af2-c44870256906","Type":"ContainerStarted","Data":"a8d6e74bd420642313c860dd5a435255729bb10dad9746b51017fceefe49778e"}
Feb 23 00:09:46 crc kubenswrapper[4735]: E0223 00:09:46.806115 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 00:09:47.306097941 +0000 UTC m=+145.769643912 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6lsk2" (UID: "769cd336-e909-4164-89c9-e0874926fd3d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 00:09:46 crc kubenswrapper[4735]: I0223 00:09:46.816185 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-kkmzn" event={"ID":"a7c6f4b8-7812-4b99-8595-fb39bdf58a5b","Type":"ContainerStarted","Data":"06b2edd8eb74c6217156a098fee08e6edd328c22d8f1fb564c6ee3c34b6becf8"}
Feb 23 00:09:46 crc kubenswrapper[4735]: I0223 00:09:46.828372 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5w4w6" event={"ID":"3f7bc3cf-ab6c-413d-b5a2-9b970f5a5d85","Type":"ContainerStarted","Data":"caf97fc3ac539b4653637d06f7fc4fdc0607cd5fe9330a840219eebaf52ef715"}
Feb 23 00:09:46 crc kubenswrapper[4735]: I0223 00:09:46.843733 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-8g4g7" event={"ID":"e160ff4c-3954-4858-a65a-6ca9b9051e88","Type":"ContainerStarted","Data":"1abedf741c33b54bcecb376d6f6a4ee04d24695b86cc3480b78fe5356fbddfc3"}
Feb 23 00:09:46 crc kubenswrapper[4735]: I0223 00:09:46.851125 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-tjkjf" event={"ID":"299ad63a-4584-4079-81c4-b8645326d3d0","Type":"ContainerStarted","Data":"25beb31bd86154f73322793c0ede04c047f19979f1ad9a63928c3fa015dc3c62"}
Feb 23 00:09:46 crc kubenswrapper[4735]: I0223 00:09:46.861040 4735 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-tjkjf container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body=
Feb 23 00:09:46 crc kubenswrapper[4735]: I0223 00:09:46.861108 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-tjkjf" podUID="299ad63a-4584-4079-81c4-b8645326d3d0" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused"
Feb 23 00:09:46 crc kubenswrapper[4735]: I0223 00:09:46.866374 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29530080-85k47" podStartSLOduration=118.866340391 podStartE2EDuration="1m58.866340391s" podCreationTimestamp="2026-02-23 00:07:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:09:46.844029664 +0000 UTC m=+145.307575655" watchObservedRunningTime="2026-02-23 00:09:46.866340391 +0000 UTC m=+145.329886362"
Feb 23 00:09:46 crc kubenswrapper[4735]: I0223 00:09:46.869693 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-xmz5p" event={"ID":"0b661fda-d14e-4491-896c-4d6812a638b5","Type":"ContainerStarted","Data":"fb68d9424d4adf0afc0abf5631a82e83535d6ff85531cd2bd66e14412917d03a"}
Feb 23 00:09:46 crc kubenswrapper[4735]: I0223 00:09:46.891048 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-8g4g7" podStartSLOduration=6.891017305 podStartE2EDuration="6.891017305s" podCreationTimestamp="2026-02-23 00:09:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:09:46.86918017 +0000 UTC m=+145.332726141" watchObservedRunningTime="2026-02-23 00:09:46.891017305 +0000 UTC m=+145.354563276"
Feb 23 00:09:46 crc kubenswrapper[4735]: I0223 00:09:46.892448 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-d2dh9" event={"ID":"a4c1e470-e025-4285-a712-52b4fead0799","Type":"ContainerStarted","Data":"8f5ee528322bf89ccc7368852172cb70118bafd79c32c1316fea9229bd994ada"}
Feb 23 00:09:46 crc kubenswrapper[4735]: I0223 00:09:46.892554 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-d2dh9" event={"ID":"a4c1e470-e025-4285-a712-52b4fead0799","Type":"ContainerStarted","Data":"255875bc0fd16fec9ef74bed14cfbf87b7ddb3e82e6c75f7d4d13ee00b14628b"}
Feb 23 00:09:46 crc kubenswrapper[4735]: I0223 00:09:46.907467 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-kkmzn" podStartSLOduration=117.907438 podStartE2EDuration="1m57.907438s" podCreationTimestamp="2026-02-23 00:07:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:09:46.905087023 +0000 UTC m=+145.368632994" watchObservedRunningTime="2026-02-23 00:09:46.907438 +0000 UTC m=+145.370983971"
Feb 23 00:09:46 crc kubenswrapper[4735]: I0223 00:09:46.911938 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 00:09:46 crc kubenswrapper[4735]: E0223 00:09:46.915129 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:09:47.415099044 +0000 UTC m=+145.878645015 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:09:46 crc kubenswrapper[4735]: I0223 00:09:46.925486 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvfx4" event={"ID":"11c76755-d9b6-472e-b761-97f73562d736","Type":"ContainerStarted","Data":"7b8f9a523ab4c55c267e72af490dfb53544ec3bcaae0bc0b81fec94650e2911a"} Feb 23 00:09:46 crc kubenswrapper[4735]: I0223 00:09:46.941325 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-d2dh9" podStartSLOduration=117.941303975 podStartE2EDuration="1m57.941303975s" podCreationTimestamp="2026-02-23 00:07:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:09:46.937988806 +0000 UTC m=+145.401534777" watchObservedRunningTime="2026-02-23 00:09:46.941303975 +0000 UTC m=+145.404849946" Feb 23 00:09:46 crc kubenswrapper[4735]: I0223 00:09:46.944107 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qnx85" Feb 23 00:09:46 crc kubenswrapper[4735]: I0223 00:09:46.971561 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-chd9d" 
event={"ID":"33a3fbdd-f167-4093-8616-f64abea28e24","Type":"ContainerStarted","Data":"02f623ca311e387afd5615c3c5584c18e4f06bffb3fa28f27470037fdf828cae"} Feb 23 00:09:46 crc kubenswrapper[4735]: I0223 00:09:46.993026 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z7rvn" event={"ID":"b9d77e51-e119-4f9e-b31f-90b7fec2c5be","Type":"ContainerStarted","Data":"af01379a3d430628be564546d30c0221aaef89757568b876fcc57c8cfef8b25f"} Feb 23 00:09:47 crc kubenswrapper[4735]: I0223 00:09:47.005590 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mvfx4" podStartSLOduration=118.005565102 podStartE2EDuration="1m58.005565102s" podCreationTimestamp="2026-02-23 00:07:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:09:46.976174985 +0000 UTC m=+145.439720956" watchObservedRunningTime="2026-02-23 00:09:47.005565102 +0000 UTC m=+145.469111073" Feb 23 00:09:47 crc kubenswrapper[4735]: I0223 00:09:47.013741 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:09:47 crc kubenswrapper[4735]: E0223 00:09:47.015797 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 00:09:47.515783338 +0000 UTC m=+145.979329499 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6lsk2" (UID: "769cd336-e909-4164-89c9-e0874926fd3d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:09:47 crc kubenswrapper[4735]: I0223 00:09:47.022957 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-87xcv" event={"ID":"a14dd71a-cfac-4f4e-9cbf-89599011d970","Type":"ContainerStarted","Data":"c5ac0f554abe47b24a336ac7c29620c9233ff8b2906fd3df83cb26f8be16b6f0"} Feb 23 00:09:47 crc kubenswrapper[4735]: I0223 00:09:47.027237 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-42hhs" event={"ID":"d1190643-6cbb-4ecd-b76e-3fe0f8a7ec57","Type":"ContainerStarted","Data":"34653f4a079a9aebd6b3e8d69e0a5845ab8cd7e07b21e98e8868ae53543fe822"} Feb 23 00:09:47 crc kubenswrapper[4735]: I0223 00:09:47.027306 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-42hhs" event={"ID":"d1190643-6cbb-4ecd-b76e-3fe0f8a7ec57","Type":"ContainerStarted","Data":"75c777175df5e01108883afe560bdbfbac5e204370be8d9e1fffbab1e16402a5"} Feb 23 00:09:47 crc kubenswrapper[4735]: I0223 00:09:47.027320 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qnx85" podStartSLOduration=119.027284994 podStartE2EDuration="1m59.027284994s" podCreationTimestamp="2026-02-23 00:07:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:09:47.006446754 +0000 UTC m=+145.469992715" watchObservedRunningTime="2026-02-23 00:09:47.027284994 
+0000 UTC m=+145.490830965" Feb 23 00:09:47 crc kubenswrapper[4735]: I0223 00:09:47.077755 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z7rvn" podStartSLOduration=118.07773772 podStartE2EDuration="1m58.07773772s" podCreationTimestamp="2026-02-23 00:07:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:09:47.050031453 +0000 UTC m=+145.513577424" watchObservedRunningTime="2026-02-23 00:09:47.07773772 +0000 UTC m=+145.541283691" Feb 23 00:09:47 crc kubenswrapper[4735]: I0223 00:09:47.115375 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 00:09:47 crc kubenswrapper[4735]: E0223 00:09:47.116445 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:09:47.616430471 +0000 UTC m=+146.079976442 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:09:47 crc kubenswrapper[4735]: I0223 00:09:47.123485 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-knx8j" event={"ID":"fec1da66-feec-438b-a9d2-a8f36d8ef790","Type":"ContainerStarted","Data":"79e63bcfabd646b632828a97a05dd6b992aa34c55871b969d702adf67b23656e"} Feb 23 00:09:47 crc kubenswrapper[4735]: I0223 00:09:47.124136 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-knx8j" Feb 23 00:09:47 crc kubenswrapper[4735]: I0223 00:09:47.130556 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kkhmq" event={"ID":"84d06127-3e73-44c4-840b-6bfed7e36f84","Type":"ContainerStarted","Data":"87b35b8138ddf46b55fd03f98bebe21f838d7a38d7723fb319dfaaf7d2f3e8c1"} Feb 23 00:09:47 crc kubenswrapper[4735]: I0223 00:09:47.152113 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-f77lm" event={"ID":"bfb40dcb-d460-44f4-a461-fc423b7aed52","Type":"ContainerStarted","Data":"8b9c43ba646535a5f72a7e5b6d5e9cf4f8a3d54f34154a470395d971852e567b"} Feb 23 00:09:47 crc kubenswrapper[4735]: I0223 00:09:47.174456 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-42hhs" podStartSLOduration=7.174438407 podStartE2EDuration="7.174438407s" podCreationTimestamp="2026-02-23 00:09:40 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:09:47.078773744 +0000 UTC m=+145.542319715" watchObservedRunningTime="2026-02-23 00:09:47.174438407 +0000 UTC m=+145.637984378" Feb 23 00:09:47 crc kubenswrapper[4735]: I0223 00:09:47.175161 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9xsmt" event={"ID":"f4dd19db-78f4-45d0-a0e6-91505f2ade61","Type":"ContainerStarted","Data":"0dddf5cdf1f86346c223064cd42330b57f6f7b170e21d74f30babd610a6ae173"} Feb 23 00:09:47 crc kubenswrapper[4735]: I0223 00:09:47.176297 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-knx8j" podStartSLOduration=118.176288592 podStartE2EDuration="1m58.176288592s" podCreationTimestamp="2026-02-23 00:07:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:09:47.17328902 +0000 UTC m=+145.636834991" watchObservedRunningTime="2026-02-23 00:09:47.176288592 +0000 UTC m=+145.639834563" Feb 23 00:09:47 crc kubenswrapper[4735]: I0223 00:09:47.190765 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l5546" event={"ID":"6d167793-63cc-42a9-9986-0cbb627e2ee4","Type":"ContainerStarted","Data":"3cc0275639464615f8a6ad1a87be40cac5aef58ab18b40743256828101a817f2"} Feb 23 00:09:47 crc kubenswrapper[4735]: I0223 00:09:47.191876 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l5546" Feb 23 00:09:47 crc kubenswrapper[4735]: I0223 00:09:47.192953 4735 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-l5546 container/catalog-operator namespace/openshift-operator-lifecycle-manager: 
Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Feb 23 00:09:47 crc kubenswrapper[4735]: I0223 00:09:47.193000 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l5546" podUID="6d167793-63cc-42a9-9986-0cbb627e2ee4" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Feb 23 00:09:47 crc kubenswrapper[4735]: I0223 00:09:47.218875 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:09:47 crc kubenswrapper[4735]: E0223 00:09:47.222857 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 00:09:47.722831642 +0000 UTC m=+146.186377613 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6lsk2" (UID: "769cd336-e909-4164-89c9-e0874926fd3d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:09:47 crc kubenswrapper[4735]: I0223 00:09:47.244161 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x959d" event={"ID":"5cf4d35e-ea62-4c76-814b-c2224558d011","Type":"ContainerStarted","Data":"beda9781c55f4bbeea6d65522c2c58b207ee994fc944815ef60c928467452cdc"} Feb 23 00:09:47 crc kubenswrapper[4735]: I0223 00:09:47.245443 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x959d" Feb 23 00:09:47 crc kubenswrapper[4735]: I0223 00:09:47.263681 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-f77lm" podStartSLOduration=118.263611503 podStartE2EDuration="1m58.263611503s" podCreationTimestamp="2026-02-23 00:07:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:09:47.245696412 +0000 UTC m=+145.709242383" watchObservedRunningTime="2026-02-23 00:09:47.263611503 +0000 UTC m=+145.727157474" Feb 23 00:09:47 crc kubenswrapper[4735]: I0223 00:09:47.266885 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kkhmq" podStartSLOduration=118.266862312 podStartE2EDuration="1m58.266862312s" podCreationTimestamp="2026-02-23 00:07:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:09:47.203136398 +0000 UTC m=+145.666682369" watchObservedRunningTime="2026-02-23 00:09:47.266862312 +0000 UTC m=+145.730408283" Feb 23 00:09:47 crc kubenswrapper[4735]: I0223 00:09:47.267621 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-h9hzt" event={"ID":"bd601b38-6549-439d-b1de-58e5c5c5c769","Type":"ContainerStarted","Data":"d8f24925574fd549471454c07ccbc78da8dca2c34bcf4a2a704e3cbb09a89b62"} Feb 23 00:09:47 crc kubenswrapper[4735]: I0223 00:09:47.271207 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-h9hzt" event={"ID":"bd601b38-6549-439d-b1de-58e5c5c5c769","Type":"ContainerStarted","Data":"eb4a153da7d6e560cf86de45e65609b13cc3f8d59e7d05cd99f1aacf9885defe"} Feb 23 00:09:47 crc kubenswrapper[4735]: I0223 00:09:47.296844 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z22gj" event={"ID":"fbea48c4-eae5-4fab-bd21-8c69eb791cb2","Type":"ContainerStarted","Data":"8c6920075aede00499b33d736538c0df801e528afdc2bd42f1daafcf754c7f3b"} Feb 23 00:09:47 crc kubenswrapper[4735]: I0223 00:09:47.296913 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z22gj" event={"ID":"fbea48c4-eae5-4fab-bd21-8c69eb791cb2","Type":"ContainerStarted","Data":"971a866c6d1ffeb687650207f1dad7d81a69105bef3234f813c8e95d5d986725"} Feb 23 00:09:47 crc kubenswrapper[4735]: I0223 00:09:47.304920 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-fpbc2" Feb 23 00:09:47 crc kubenswrapper[4735]: I0223 00:09:47.327107 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 00:09:47 crc kubenswrapper[4735]: E0223 00:09:47.328367 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:09:47.828339452 +0000 UTC m=+146.291885423 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:09:47 crc kubenswrapper[4735]: I0223 00:09:47.328739 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l5546" podStartSLOduration=118.328701541 podStartE2EDuration="1m58.328701541s" podCreationTimestamp="2026-02-23 00:07:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:09:47.293390231 +0000 UTC m=+145.756936212" watchObservedRunningTime="2026-02-23 00:09:47.328701541 +0000 UTC m=+145.792247512" Feb 23 00:09:47 crc kubenswrapper[4735]: I0223 00:09:47.345831 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x959d" Feb 23 00:09:47 crc kubenswrapper[4735]: I0223 00:09:47.345827 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9xsmt" podStartSLOduration=118.345806712 podStartE2EDuration="1m58.345806712s" podCreationTimestamp="2026-02-23 00:07:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:09:47.325601006 +0000 UTC m=+145.789146977" watchObservedRunningTime="2026-02-23 00:09:47.345806712 +0000 UTC m=+145.809352683" Feb 23 00:09:47 crc kubenswrapper[4735]: I0223 00:09:47.394816 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x959d" podStartSLOduration=118.394794751 podStartE2EDuration="1m58.394794751s" podCreationTimestamp="2026-02-23 00:07:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:09:47.370048435 +0000 UTC m=+145.833594406" watchObservedRunningTime="2026-02-23 00:09:47.394794751 +0000 UTC m=+145.858340722" Feb 23 00:09:47 crc kubenswrapper[4735]: I0223 00:09:47.430200 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:09:47 crc kubenswrapper[4735]: I0223 00:09:47.440819 4735 csr.go:261] certificate signing request csr-v9q9k is approved, waiting to be issued Feb 23 00:09:47 crc kubenswrapper[4735]: E0223 00:09:47.442020 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-23 00:09:47.942004068 +0000 UTC m=+146.405550039 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6lsk2" (UID: "769cd336-e909-4164-89c9-e0874926fd3d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:09:47 crc kubenswrapper[4735]: I0223 00:09:47.485036 4735 csr.go:257] certificate signing request csr-v9q9k is issued Feb 23 00:09:47 crc kubenswrapper[4735]: I0223 00:09:47.494622 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z22gj" podStartSLOduration=118.494608423 podStartE2EDuration="1m58.494608423s" podCreationTimestamp="2026-02-23 00:07:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:09:47.494276946 +0000 UTC m=+145.957822917" watchObservedRunningTime="2026-02-23 00:09:47.494608423 +0000 UTC m=+145.958154394" Feb 23 00:09:47 crc kubenswrapper[4735]: I0223 00:09:47.533967 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 00:09:47 crc kubenswrapper[4735]: E0223 00:09:47.534626 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-23 00:09:48.034601937 +0000 UTC m=+146.498147908 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:09:47 crc kubenswrapper[4735]: I0223 00:09:47.537030 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-h9hzt" podStartSLOduration=118.537010335 podStartE2EDuration="1m58.537010335s" podCreationTimestamp="2026-02-23 00:07:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:09:47.534423712 +0000 UTC m=+145.997969683" watchObservedRunningTime="2026-02-23 00:09:47.537010335 +0000 UTC m=+146.000556306" Feb 23 00:09:47 crc kubenswrapper[4735]: I0223 00:09:47.635580 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:09:47 crc kubenswrapper[4735]: E0223 00:09:47.635915 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 00:09:48.135902715 +0000 UTC m=+146.599448686 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6lsk2" (UID: "769cd336-e909-4164-89c9-e0874926fd3d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:09:47 crc kubenswrapper[4735]: I0223 00:09:47.695448 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-f77lm" Feb 23 00:09:47 crc kubenswrapper[4735]: I0223 00:09:47.701516 4735 patch_prober.go:28] interesting pod/router-default-5444994796-f77lm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 00:09:47 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Feb 23 00:09:47 crc kubenswrapper[4735]: [+]process-running ok Feb 23 00:09:47 crc kubenswrapper[4735]: healthz check failed Feb 23 00:09:47 crc kubenswrapper[4735]: I0223 00:09:47.701765 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f77lm" podUID="bfb40dcb-d460-44f4-a461-fc423b7aed52" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 00:09:47 crc kubenswrapper[4735]: I0223 00:09:47.737265 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 00:09:47 crc kubenswrapper[4735]: E0223 00:09:47.737449 4735 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:09:48.237422849 +0000 UTC m=+146.700968820 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:09:47 crc kubenswrapper[4735]: I0223 00:09:47.838741 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:09:47 crc kubenswrapper[4735]: E0223 00:09:47.839224 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 00:09:48.339206358 +0000 UTC m=+146.802752329 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6lsk2" (UID: "769cd336-e909-4164-89c9-e0874926fd3d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:09:47 crc kubenswrapper[4735]: I0223 00:09:47.939821 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 00:09:47 crc kubenswrapper[4735]: E0223 00:09:47.939929 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:09:48.439913392 +0000 UTC m=+146.903459363 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:09:47 crc kubenswrapper[4735]: I0223 00:09:47.940187 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:09:47 crc kubenswrapper[4735]: E0223 00:09:47.940493 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 00:09:48.440485827 +0000 UTC m=+146.904031798 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6lsk2" (UID: "769cd336-e909-4164-89c9-e0874926fd3d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:09:48 crc kubenswrapper[4735]: I0223 00:09:48.041402 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 00:09:48 crc kubenswrapper[4735]: E0223 00:09:48.041634 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:09:48.54160031 +0000 UTC m=+147.005146281 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:09:48 crc kubenswrapper[4735]: I0223 00:09:48.041828 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:09:48 crc kubenswrapper[4735]: E0223 00:09:48.042154 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 00:09:48.542146703 +0000 UTC m=+147.005692674 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6lsk2" (UID: "769cd336-e909-4164-89c9-e0874926fd3d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:09:48 crc kubenswrapper[4735]: I0223 00:09:48.143179 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 00:09:48 crc kubenswrapper[4735]: E0223 00:09:48.143356 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:09:48.643330919 +0000 UTC m=+147.106876890 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:09:48 crc kubenswrapper[4735]: I0223 00:09:48.143414 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:09:48 crc kubenswrapper[4735]: E0223 00:09:48.143762 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 00:09:48.643749729 +0000 UTC m=+147.107295700 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6lsk2" (UID: "769cd336-e909-4164-89c9-e0874926fd3d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:09:48 crc kubenswrapper[4735]: I0223 00:09:48.244901 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 00:09:48 crc kubenswrapper[4735]: E0223 00:09:48.245097 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:09:48.745071788 +0000 UTC m=+147.208617759 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:09:48 crc kubenswrapper[4735]: I0223 00:09:48.245637 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:09:48 crc kubenswrapper[4735]: E0223 00:09:48.245974 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 00:09:48.745961579 +0000 UTC m=+147.209507550 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6lsk2" (UID: "769cd336-e909-4164-89c9-e0874926fd3d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:09:48 crc kubenswrapper[4735]: I0223 00:09:48.314701 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-87xcv" event={"ID":"a14dd71a-cfac-4f4e-9cbf-89599011d970","Type":"ContainerStarted","Data":"6ab1bb00e9f5d202d44a5128f641a4cfdef200e4c00cfba48f8e16f013c825b1"} Feb 23 00:09:48 crc kubenswrapper[4735]: I0223 00:09:48.322376 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-lftlx" event={"ID":"115b8c26-73db-4da1-a9b6-567a7ce7151e","Type":"ContainerStarted","Data":"4985b02ba12341d11ee63a02771f3c221c2cd7ec3feeebbf2ee40966ac1e3c1a"} Feb 23 00:09:48 crc kubenswrapper[4735]: I0223 00:09:48.326712 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530080-85k47" event={"ID":"845be7d4-ea70-441f-8af2-c44870256906","Type":"ContainerStarted","Data":"28c45b2cbdcf3d8161c129cfc8c3d5e54fc40e166f96deaabb0661115f59ad2d"} Feb 23 00:09:48 crc kubenswrapper[4735]: I0223 00:09:48.333551 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-chd9d" event={"ID":"33a3fbdd-f167-4093-8616-f64abea28e24","Type":"ContainerStarted","Data":"7c91be3ee0fc75c2f66525ec4699312e449eaaba9fee7d64157d3a655ecbb0ff"} Feb 23 00:09:48 crc kubenswrapper[4735]: I0223 00:09:48.333637 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-chd9d" event={"ID":"33a3fbdd-f167-4093-8616-f64abea28e24","Type":"ContainerStarted","Data":"011227c5d7af1ec2c086ec6b8dd5390611fec0fa94f0ce072801e34e4e814ed5"} Feb 23 00:09:48 crc kubenswrapper[4735]: I0223 00:09:48.337310 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-kkhmq" event={"ID":"84d06127-3e73-44c4-840b-6bfed7e36f84","Type":"ContainerStarted","Data":"c5ee551b567389caabe4e65572a480609345de559f5a083a0fd27fc77efeca67"} Feb 23 00:09:48 crc kubenswrapper[4735]: I0223 00:09:48.339788 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-9xsmt" event={"ID":"f4dd19db-78f4-45d0-a0e6-91505f2ade61","Type":"ContainerStarted","Data":"51b1df37908c42735c3b09d6506355908351d75e9427767c58f7a0b628c9a620"} Feb 23 00:09:48 crc kubenswrapper[4735]: I0223 00:09:48.341653 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z22gj" event={"ID":"fbea48c4-eae5-4fab-bd21-8c69eb791cb2","Type":"ContainerStarted","Data":"5316bd7cf60177cf4a3fd476eee2810b9dfd0a00b0972d789e9d8d7b46d39a7e"} Feb 23 00:09:48 crc kubenswrapper[4735]: I0223 00:09:48.346003 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g2t27" event={"ID":"792fd1c0-de50-429b-89ff-6a3f64541e29","Type":"ContainerStarted","Data":"d3f6ae21bf666130b5efe7b6a15e2588520b21b84ac587998611345299603ab8"} Feb 23 00:09:48 crc kubenswrapper[4735]: I0223 00:09:48.347016 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 
00:09:48 crc kubenswrapper[4735]: E0223 00:09:48.347108 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:09:48.847093303 +0000 UTC m=+147.310639274 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:09:48 crc kubenswrapper[4735]: I0223 00:09:48.347352 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:09:48 crc kubenswrapper[4735]: E0223 00:09:48.347632 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 00:09:48.847625876 +0000 UTC m=+147.311171847 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6lsk2" (UID: "769cd336-e909-4164-89c9-e0874926fd3d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:09:48 crc kubenswrapper[4735]: I0223 00:09:48.349895 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-z7rvn" event={"ID":"b9d77e51-e119-4f9e-b31f-90b7fec2c5be","Type":"ContainerStarted","Data":"470a62f21ffa7c496956cbb7e823c77a619f8673b3fa044305fd0b2fca251047"} Feb 23 00:09:48 crc kubenswrapper[4735]: I0223 00:09:48.352216 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-87xcv" podStartSLOduration=120.352199347 podStartE2EDuration="2m0.352199347s" podCreationTimestamp="2026-02-23 00:07:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:09:48.342590645 +0000 UTC m=+146.806136616" watchObservedRunningTime="2026-02-23 00:09:48.352199347 +0000 UTC m=+146.815745318" Feb 23 00:09:48 crc kubenswrapper[4735]: I0223 00:09:48.355353 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vh89z" event={"ID":"9af6d235-3d3f-49c9-9338-e7302ef9deae","Type":"ContainerStarted","Data":"037d1a97d03db9a2eede6293902649fad9b03d1e3cc36f1451c2eebe6a40e923"} Feb 23 00:09:48 crc kubenswrapper[4735]: I0223 00:09:48.357512 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5w4w6" 
event={"ID":"3f7bc3cf-ab6c-413d-b5a2-9b970f5a5d85","Type":"ContainerStarted","Data":"8f703aba51fb7cd67aa0a8b19a4a4e33178985d987d25f1b2d6394aa30c64749"} Feb 23 00:09:48 crc kubenswrapper[4735]: I0223 00:09:48.357548 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5w4w6" event={"ID":"3f7bc3cf-ab6c-413d-b5a2-9b970f5a5d85","Type":"ContainerStarted","Data":"e2e441a7adca932b6fd0459748836fbb504b9d61fdc27f0d52672d031aaebdff"} Feb 23 00:09:48 crc kubenswrapper[4735]: I0223 00:09:48.358397 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-5w4w6" Feb 23 00:09:48 crc kubenswrapper[4735]: I0223 00:09:48.362675 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qnx85" event={"ID":"6af08a92-85a9-4200-a0d8-abff73e0e93b","Type":"ContainerStarted","Data":"a102ceef0cb4023ec3d59e84176f9ccb53de06c64bbd8bf22c24e248db9095f0"} Feb 23 00:09:48 crc kubenswrapper[4735]: I0223 00:09:48.372781 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-l5546" Feb 23 00:09:48 crc kubenswrapper[4735]: I0223 00:09:48.377995 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-lftlx" podStartSLOduration=119.377972847 podStartE2EDuration="1m59.377972847s" podCreationTimestamp="2026-02-23 00:07:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:09:48.376274306 +0000 UTC m=+146.839820267" watchObservedRunningTime="2026-02-23 00:09:48.377972847 +0000 UTC m=+146.841518828" Feb 23 00:09:48 crc kubenswrapper[4735]: I0223 00:09:48.408593 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-tjkjf" Feb 23 
00:09:48 crc kubenswrapper[4735]: I0223 00:09:48.449056 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 00:09:48 crc kubenswrapper[4735]: E0223 00:09:48.452784 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:09:48.952742487 +0000 UTC m=+147.416288468 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:09:48 crc kubenswrapper[4735]: I0223 00:09:48.462914 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-chd9d" podStartSLOduration=120.462882351 podStartE2EDuration="2m0.462882351s" podCreationTimestamp="2026-02-23 00:07:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:09:48.400890019 +0000 UTC m=+146.864435990" watchObservedRunningTime="2026-02-23 00:09:48.462882351 +0000 UTC m=+146.926428332" Feb 23 00:09:48 crc kubenswrapper[4735]: I0223 00:09:48.488211 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 
2027-02-23 00:04:47 +0000 UTC, rotation deadline is 2026-12-14 11:49:42.118034138 +0000 UTC Feb 23 00:09:48 crc kubenswrapper[4735]: I0223 00:09:48.488244 4735 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7067h39m53.629792917s for next certificate rotation Feb 23 00:09:48 crc kubenswrapper[4735]: I0223 00:09:48.492502 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g2t27" podStartSLOduration=119.492469553 podStartE2EDuration="1m59.492469553s" podCreationTimestamp="2026-02-23 00:07:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:09:48.482900102 +0000 UTC m=+146.946446073" watchObservedRunningTime="2026-02-23 00:09:48.492469553 +0000 UTC m=+146.956015524" Feb 23 00:09:48 crc kubenswrapper[4735]: I0223 00:09:48.518195 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-5w4w6" podStartSLOduration=8.518174480999999 podStartE2EDuration="8.518174481s" podCreationTimestamp="2026-02-23 00:09:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:09:48.5164522 +0000 UTC m=+146.979998171" watchObservedRunningTime="2026-02-23 00:09:48.518174481 +0000 UTC m=+146.981720442" Feb 23 00:09:48 crc kubenswrapper[4735]: I0223 00:09:48.558572 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:09:48 crc kubenswrapper[4735]: E0223 00:09:48.558976 4735 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 00:09:49.058963323 +0000 UTC m=+147.522509304 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6lsk2" (UID: "769cd336-e909-4164-89c9-e0874926fd3d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:09:48 crc kubenswrapper[4735]: I0223 00:09:48.660226 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 00:09:48 crc kubenswrapper[4735]: E0223 00:09:48.661439 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:09:49.16141598 +0000 UTC m=+147.624961951 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:09:48 crc kubenswrapper[4735]: I0223 00:09:48.700025 4735 patch_prober.go:28] interesting pod/router-default-5444994796-f77lm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 00:09:48 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Feb 23 00:09:48 crc kubenswrapper[4735]: [+]process-running ok Feb 23 00:09:48 crc kubenswrapper[4735]: healthz check failed Feb 23 00:09:48 crc kubenswrapper[4735]: I0223 00:09:48.700076 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f77lm" podUID="bfb40dcb-d460-44f4-a461-fc423b7aed52" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 00:09:48 crc kubenswrapper[4735]: I0223 00:09:48.761682 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:09:48 crc kubenswrapper[4735]: E0223 00:09:48.762041 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-23 00:09:49.262028581 +0000 UTC m=+147.725574552 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6lsk2" (UID: "769cd336-e909-4164-89c9-e0874926fd3d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:09:48 crc kubenswrapper[4735]: I0223 00:09:48.862895 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 00:09:48 crc kubenswrapper[4735]: E0223 00:09:48.863097 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:09:49.363072504 +0000 UTC m=+147.826618475 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:09:48 crc kubenswrapper[4735]: I0223 00:09:48.863291 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:09:48 crc kubenswrapper[4735]: E0223 00:09:48.863594 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 00:09:49.363587446 +0000 UTC m=+147.827133417 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6lsk2" (UID: "769cd336-e909-4164-89c9-e0874926fd3d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:09:48 crc kubenswrapper[4735]: I0223 00:09:48.964762 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 00:09:48 crc kubenswrapper[4735]: E0223 00:09:48.964942 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:09:49.464916254 +0000 UTC m=+147.928462225 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 00:09:48 crc kubenswrapper[4735]: I0223 00:09:48.965005 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2"
Feb 23 00:09:48 crc kubenswrapper[4735]: E0223 00:09:48.965395 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 00:09:49.465384196 +0000 UTC m=+147.928930167 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6lsk2" (UID: "769cd336-e909-4164-89c9-e0874926fd3d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.066596 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 00:09:49 crc kubenswrapper[4735]: E0223 00:09:49.067029 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:09:49.567004032 +0000 UTC m=+148.030550003 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.082266 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k8wg7"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.109602 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l4mrf"]
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.110456 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l4mrf"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.168176 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17e19891-1a63-4c0e-abd9-a161f96cf71e-utilities\") pod \"community-operators-l4mrf\" (UID: \"17e19891-1a63-4c0e-abd9-a161f96cf71e\") " pod="openshift-marketplace/community-operators-l4mrf"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.168362 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17e19891-1a63-4c0e-abd9-a161f96cf71e-catalog-content\") pod \"community-operators-l4mrf\" (UID: \"17e19891-1a63-4c0e-abd9-a161f96cf71e\") " pod="openshift-marketplace/community-operators-l4mrf"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.168449 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54h5v\" (UniqueName: \"kubernetes.io/projected/17e19891-1a63-4c0e-abd9-a161f96cf71e-kube-api-access-54h5v\") pod \"community-operators-l4mrf\" (UID: \"17e19891-1a63-4c0e-abd9-a161f96cf71e\") " pod="openshift-marketplace/community-operators-l4mrf"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.168549 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2"
Feb 23 00:09:49 crc kubenswrapper[4735]: E0223 00:09:49.168901 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 00:09:49.668883875 +0000 UTC m=+148.132429846 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6lsk2" (UID: "769cd336-e909-4164-89c9-e0874926fd3d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.176240 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.178760 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l4mrf"]
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.270295 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.270737 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17e19891-1a63-4c0e-abd9-a161f96cf71e-utilities\") pod \"community-operators-l4mrf\" (UID: \"17e19891-1a63-4c0e-abd9-a161f96cf71e\") " pod="openshift-marketplace/community-operators-l4mrf"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.270891 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17e19891-1a63-4c0e-abd9-a161f96cf71e-catalog-content\") pod \"community-operators-l4mrf\" (UID: \"17e19891-1a63-4c0e-abd9-a161f96cf71e\") " pod="openshift-marketplace/community-operators-l4mrf"
Feb 23 00:09:49 crc kubenswrapper[4735]: E0223 00:09:49.270994 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:09:49.770965362 +0000 UTC m=+148.234511333 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.271041 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54h5v\" (UniqueName: \"kubernetes.io/projected/17e19891-1a63-4c0e-abd9-a161f96cf71e-kube-api-access-54h5v\") pod \"community-operators-l4mrf\" (UID: \"17e19891-1a63-4c0e-abd9-a161f96cf71e\") " pod="openshift-marketplace/community-operators-l4mrf"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.271130 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2"
Feb 23 00:09:49 crc kubenswrapper[4735]: E0223 00:09:49.271465 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 00:09:49.771458893 +0000 UTC m=+148.235004864 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6lsk2" (UID: "769cd336-e909-4164-89c9-e0874926fd3d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.271619 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17e19891-1a63-4c0e-abd9-a161f96cf71e-catalog-content\") pod \"community-operators-l4mrf\" (UID: \"17e19891-1a63-4c0e-abd9-a161f96cf71e\") " pod="openshift-marketplace/community-operators-l4mrf"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.271904 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17e19891-1a63-4c0e-abd9-a161f96cf71e-utilities\") pod \"community-operators-l4mrf\" (UID: \"17e19891-1a63-4c0e-abd9-a161f96cf71e\") " pod="openshift-marketplace/community-operators-l4mrf"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.300727 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-72lxc"]
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.301907 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-72lxc"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.312683 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.319973 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54h5v\" (UniqueName: \"kubernetes.io/projected/17e19891-1a63-4c0e-abd9-a161f96cf71e-kube-api-access-54h5v\") pod \"community-operators-l4mrf\" (UID: \"17e19891-1a63-4c0e-abd9-a161f96cf71e\") " pod="openshift-marketplace/community-operators-l4mrf"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.338173 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-72lxc"]
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.371963 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.372349 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f444dee5-d7dc-47ed-add6-b3b1148077f2-utilities\") pod \"certified-operators-72lxc\" (UID: \"f444dee5-d7dc-47ed-add6-b3b1148077f2\") " pod="openshift-marketplace/certified-operators-72lxc"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.372454 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kfbt\" (UniqueName: \"kubernetes.io/projected/f444dee5-d7dc-47ed-add6-b3b1148077f2-kube-api-access-4kfbt\") pod \"certified-operators-72lxc\" (UID: \"f444dee5-d7dc-47ed-add6-b3b1148077f2\") " pod="openshift-marketplace/certified-operators-72lxc"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.372582 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f444dee5-d7dc-47ed-add6-b3b1148077f2-catalog-content\") pod \"certified-operators-72lxc\" (UID: \"f444dee5-d7dc-47ed-add6-b3b1148077f2\") " pod="openshift-marketplace/certified-operators-72lxc"
Feb 23 00:09:49 crc kubenswrapper[4735]: E0223 00:09:49.372733 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:09:49.872717761 +0000 UTC m=+148.336263732 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.428184 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l4mrf"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.429238 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vh89z" event={"ID":"9af6d235-3d3f-49c9-9338-e7302ef9deae","Type":"ContainerStarted","Data":"0c3462f09b23179bc006059cfa25388503bd10643ffa11108121bdbaa1614635"}
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.429297 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vh89z" event={"ID":"9af6d235-3d3f-49c9-9338-e7302ef9deae","Type":"ContainerStarted","Data":"c772596e46294e38ba8406ff84efdda0a61a48063c2bb9757fbba65ac1ff6e8e"}
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.445357 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qnx85"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.469086 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bzbm8"]
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.470235 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bzbm8"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.478934 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f444dee5-d7dc-47ed-add6-b3b1148077f2-catalog-content\") pod \"certified-operators-72lxc\" (UID: \"f444dee5-d7dc-47ed-add6-b3b1148077f2\") " pod="openshift-marketplace/certified-operators-72lxc"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.479126 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f444dee5-d7dc-47ed-add6-b3b1148077f2-utilities\") pod \"certified-operators-72lxc\" (UID: \"f444dee5-d7dc-47ed-add6-b3b1148077f2\") " pod="openshift-marketplace/certified-operators-72lxc"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.479398 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kfbt\" (UniqueName: \"kubernetes.io/projected/f444dee5-d7dc-47ed-add6-b3b1148077f2-kube-api-access-4kfbt\") pod \"certified-operators-72lxc\" (UID: \"f444dee5-d7dc-47ed-add6-b3b1148077f2\") " pod="openshift-marketplace/certified-operators-72lxc"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.479578 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.487589 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f444dee5-d7dc-47ed-add6-b3b1148077f2-catalog-content\") pod \"certified-operators-72lxc\" (UID: \"f444dee5-d7dc-47ed-add6-b3b1148077f2\") " pod="openshift-marketplace/certified-operators-72lxc"
Feb 23 00:09:49 crc kubenswrapper[4735]: E0223 00:09:49.495770 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 00:09:49.995757023 +0000 UTC m=+148.459302994 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6lsk2" (UID: "769cd336-e909-4164-89c9-e0874926fd3d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.499802 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f444dee5-d7dc-47ed-add6-b3b1148077f2-utilities\") pod \"certified-operators-72lxc\" (UID: \"f444dee5-d7dc-47ed-add6-b3b1148077f2\") " pod="openshift-marketplace/certified-operators-72lxc"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.545800 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kfbt\" (UniqueName: \"kubernetes.io/projected/f444dee5-d7dc-47ed-add6-b3b1148077f2-kube-api-access-4kfbt\") pod \"certified-operators-72lxc\" (UID: \"f444dee5-d7dc-47ed-add6-b3b1148077f2\") " pod="openshift-marketplace/certified-operators-72lxc"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.578968 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bzbm8"]
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.580947 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.581332 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.581410 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4ec8027-3683-44b2-a91a-a58f49dedbfd-catalog-content\") pod \"community-operators-bzbm8\" (UID: \"c4ec8027-3683-44b2-a91a-a58f49dedbfd\") " pod="openshift-marketplace/community-operators-bzbm8"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.581439 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4ec8027-3683-44b2-a91a-a58f49dedbfd-utilities\") pod \"community-operators-bzbm8\" (UID: \"c4ec8027-3683-44b2-a91a-a58f49dedbfd\") " pod="openshift-marketplace/community-operators-bzbm8"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.581472 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nvrz\" (UniqueName: \"kubernetes.io/projected/c4ec8027-3683-44b2-a91a-a58f49dedbfd-kube-api-access-6nvrz\") pod \"community-operators-bzbm8\" (UID: \"c4ec8027-3683-44b2-a91a-a58f49dedbfd\") " pod="openshift-marketplace/community-operators-bzbm8"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.581497 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 00:09:49 crc kubenswrapper[4735]: E0223 00:09:49.584243 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:09:50.084219372 +0000 UTC m=+148.547765343 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.596967 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.603093 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.665137 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-72lxc"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.690512 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.690592 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.690636 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.690668 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4ec8027-3683-44b2-a91a-a58f49dedbfd-catalog-content\") pod \"community-operators-bzbm8\" (UID: \"c4ec8027-3683-44b2-a91a-a58f49dedbfd\") " pod="openshift-marketplace/community-operators-bzbm8"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.690691 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4ec8027-3683-44b2-a91a-a58f49dedbfd-utilities\") pod \"community-operators-bzbm8\" (UID: \"c4ec8027-3683-44b2-a91a-a58f49dedbfd\") " pod="openshift-marketplace/community-operators-bzbm8"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.690722 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nvrz\" (UniqueName: \"kubernetes.io/projected/c4ec8027-3683-44b2-a91a-a58f49dedbfd-kube-api-access-6nvrz\") pod \"community-operators-bzbm8\" (UID: \"c4ec8027-3683-44b2-a91a-a58f49dedbfd\") " pod="openshift-marketplace/community-operators-bzbm8"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.692015 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gcgz5"]
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.693038 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gcgz5"
Feb 23 00:09:49 crc kubenswrapper[4735]: E0223 00:09:49.694241 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 00:09:50.194222229 +0000 UTC m=+148.657768200 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6lsk2" (UID: "769cd336-e909-4164-89c9-e0874926fd3d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.694814 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4ec8027-3683-44b2-a91a-a58f49dedbfd-catalog-content\") pod \"community-operators-bzbm8\" (UID: \"c4ec8027-3683-44b2-a91a-a58f49dedbfd\") " pod="openshift-marketplace/community-operators-bzbm8"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.695052 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4ec8027-3683-44b2-a91a-a58f49dedbfd-utilities\") pod \"community-operators-bzbm8\" (UID: \"c4ec8027-3683-44b2-a91a-a58f49dedbfd\") " pod="openshift-marketplace/community-operators-bzbm8"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.708237 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.709184 4735 patch_prober.go:28] interesting pod/router-default-5444994796-f77lm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 00:09:49 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld
Feb 23 00:09:49 crc kubenswrapper[4735]: [+]process-running ok
Feb 23 00:09:49 crc kubenswrapper[4735]: healthz check failed
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.709238 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f77lm" podUID="bfb40dcb-d460-44f4-a461-fc423b7aed52" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.709267 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.712731 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.743940 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nvrz\" (UniqueName: \"kubernetes.io/projected/c4ec8027-3683-44b2-a91a-a58f49dedbfd-kube-api-access-6nvrz\") pod \"community-operators-bzbm8\" (UID: \"c4ec8027-3683-44b2-a91a-a58f49dedbfd\") " pod="openshift-marketplace/community-operators-bzbm8"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.743999 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gcgz5"]
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.764651 4735 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.796412 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.796889 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpqv4\" (UniqueName: \"kubernetes.io/projected/78bcce9f-48e6-49c5-8f3f-75d8e156f6bc-kube-api-access-bpqv4\") pod \"certified-operators-gcgz5\" (UID: \"78bcce9f-48e6-49c5-8f3f-75d8e156f6bc\") " pod="openshift-marketplace/certified-operators-gcgz5"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.796941 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78bcce9f-48e6-49c5-8f3f-75d8e156f6bc-catalog-content\") pod \"certified-operators-gcgz5\" (UID: \"78bcce9f-48e6-49c5-8f3f-75d8e156f6bc\") " pod="openshift-marketplace/certified-operators-gcgz5"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.796966 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78bcce9f-48e6-49c5-8f3f-75d8e156f6bc-utilities\") pod \"certified-operators-gcgz5\" (UID: \"78bcce9f-48e6-49c5-8f3f-75d8e156f6bc\") " pod="openshift-marketplace/certified-operators-gcgz5"
Feb 23 00:09:49 crc kubenswrapper[4735]: E0223 00:09:49.797061 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:09:50.297044355 +0000 UTC m=+148.760590326 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.801706 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.867301 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bzbm8"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.898680 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78bcce9f-48e6-49c5-8f3f-75d8e156f6bc-catalog-content\") pod \"certified-operators-gcgz5\" (UID: \"78bcce9f-48e6-49c5-8f3f-75d8e156f6bc\") " pod="openshift-marketplace/certified-operators-gcgz5"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.898734 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78bcce9f-48e6-49c5-8f3f-75d8e156f6bc-utilities\") pod \"certified-operators-gcgz5\" (UID: \"78bcce9f-48e6-49c5-8f3f-75d8e156f6bc\") " pod="openshift-marketplace/certified-operators-gcgz5"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.898773 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.898806 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpqv4\" (UniqueName: \"kubernetes.io/projected/78bcce9f-48e6-49c5-8f3f-75d8e156f6bc-kube-api-access-bpqv4\") pod \"certified-operators-gcgz5\" (UID: \"78bcce9f-48e6-49c5-8f3f-75d8e156f6bc\") " pod="openshift-marketplace/certified-operators-gcgz5"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.899493 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78bcce9f-48e6-49c5-8f3f-75d8e156f6bc-catalog-content\") pod \"certified-operators-gcgz5\" (UID: \"78bcce9f-48e6-49c5-8f3f-75d8e156f6bc\") " pod="openshift-marketplace/certified-operators-gcgz5"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.899721 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78bcce9f-48e6-49c5-8f3f-75d8e156f6bc-utilities\") pod \"certified-operators-gcgz5\" (UID: \"78bcce9f-48e6-49c5-8f3f-75d8e156f6bc\") " pod="openshift-marketplace/certified-operators-gcgz5"
Feb 23 00:09:49 crc kubenswrapper[4735]: E0223 00:09:49.899977 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 00:09:50.399966182 +0000 UTC m=+148.863512153 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6lsk2" (UID: "769cd336-e909-4164-89c9-e0874926fd3d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.960804 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpqv4\" (UniqueName: \"kubernetes.io/projected/78bcce9f-48e6-49c5-8f3f-75d8e156f6bc-kube-api-access-bpqv4\") pod \"certified-operators-gcgz5\" (UID: \"78bcce9f-48e6-49c5-8f3f-75d8e156f6bc\") " pod="openshift-marketplace/certified-operators-gcgz5"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.996157 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 23 00:09:49 crc kubenswrapper[4735]: I0223 00:09:49.999245 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 23 00:09:49 crc kubenswrapper[4735]: E0223 00:09:49.999544 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:09:50.499528938 +0000 UTC m=+148.963074909 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 23 00:09:50 crc kubenswrapper[4735]: I0223 00:09:50.101053 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2"
Feb 23 00:09:50 crc kubenswrapper[4735]: E0223 00:09:50.101354 4735 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 00:09:50.601341549 +0000 UTC m=+149.064887520 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6lsk2" (UID: "769cd336-e909-4164-89c9-e0874926fd3d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:09:50 crc kubenswrapper[4735]: I0223 00:09:50.114215 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gcgz5" Feb 23 00:09:50 crc kubenswrapper[4735]: I0223 00:09:50.121754 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l4mrf"] Feb 23 00:09:50 crc kubenswrapper[4735]: I0223 00:09:50.131952 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 23 00:09:50 crc kubenswrapper[4735]: I0223 00:09:50.132592 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 23 00:09:50 crc kubenswrapper[4735]: W0223 00:09:50.143131 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17e19891_1a63_4c0e_abd9_a161f96cf71e.slice/crio-bce3cf1157e9325a201341711515fc1ac739dac84d8b64a31318d22f03be1e03 WatchSource:0}: Error finding container bce3cf1157e9325a201341711515fc1ac739dac84d8b64a31318d22f03be1e03: Status 404 returned error can't find the container with id bce3cf1157e9325a201341711515fc1ac739dac84d8b64a31318d22f03be1e03 Feb 23 00:09:50 crc kubenswrapper[4735]: I0223 00:09:50.143522 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 23 00:09:50 crc kubenswrapper[4735]: I0223 00:09:50.143573 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 23 00:09:50 crc kubenswrapper[4735]: I0223 00:09:50.158890 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 23 00:09:50 crc kubenswrapper[4735]: I0223 00:09:50.205807 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 00:09:50 crc kubenswrapper[4735]: I0223 00:09:50.206298 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17b02993-5531-450d-8056-8b3e9a7bc1d2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"17b02993-5531-450d-8056-8b3e9a7bc1d2\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 23 00:09:50 crc kubenswrapper[4735]: I0223 00:09:50.206355 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/17b02993-5531-450d-8056-8b3e9a7bc1d2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"17b02993-5531-450d-8056-8b3e9a7bc1d2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 23 00:09:50 crc kubenswrapper[4735]: E0223 00:09:50.206459 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:09:50.706443699 +0000 UTC m=+149.169989670 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:09:50 crc kubenswrapper[4735]: I0223 00:09:50.309946 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17b02993-5531-450d-8056-8b3e9a7bc1d2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"17b02993-5531-450d-8056-8b3e9a7bc1d2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 23 00:09:50 crc kubenswrapper[4735]: I0223 00:09:50.310391 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/17b02993-5531-450d-8056-8b3e9a7bc1d2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: 
\"17b02993-5531-450d-8056-8b3e9a7bc1d2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 23 00:09:50 crc kubenswrapper[4735]: I0223 00:09:50.310429 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/17b02993-5531-450d-8056-8b3e9a7bc1d2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"17b02993-5531-450d-8056-8b3e9a7bc1d2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 23 00:09:50 crc kubenswrapper[4735]: I0223 00:09:50.310469 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:09:50 crc kubenswrapper[4735]: E0223 00:09:50.310705 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-23 00:09:50.810695089 +0000 UTC m=+149.274241060 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6lsk2" (UID: "769cd336-e909-4164-89c9-e0874926fd3d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:09:50 crc kubenswrapper[4735]: I0223 00:09:50.331049 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-72lxc"] Feb 23 00:09:50 crc kubenswrapper[4735]: I0223 00:09:50.384389 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17b02993-5531-450d-8056-8b3e9a7bc1d2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"17b02993-5531-450d-8056-8b3e9a7bc1d2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 23 00:09:50 crc kubenswrapper[4735]: I0223 00:09:50.411797 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 00:09:50 crc kubenswrapper[4735]: E0223 00:09:50.412176 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:09:50.912160981 +0000 UTC m=+149.375706952 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:09:50 crc kubenswrapper[4735]: I0223 00:09:50.477382 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vh89z" event={"ID":"9af6d235-3d3f-49c9-9338-e7302ef9deae","Type":"ContainerStarted","Data":"3e12d681ed5f7e0d7b458a4a8c98aed75e94c99e492d90f7e28a7fc9a74833db"} Feb 23 00:09:50 crc kubenswrapper[4735]: I0223 00:09:50.492966 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 23 00:09:50 crc kubenswrapper[4735]: I0223 00:09:50.492994 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l4mrf" event={"ID":"17e19891-1a63-4c0e-abd9-a161f96cf71e","Type":"ContainerStarted","Data":"bce3cf1157e9325a201341711515fc1ac739dac84d8b64a31318d22f03be1e03"} Feb 23 00:09:50 crc kubenswrapper[4735]: I0223 00:09:50.514825 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:09:50 crc kubenswrapper[4735]: E0223 00:09:50.516840 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-23 00:09:51.01682383 +0000 UTC m=+149.480369801 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-6lsk2" (UID: "769cd336-e909-4164-89c9-e0874926fd3d") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:09:50 crc kubenswrapper[4735]: I0223 00:09:50.536602 4735 generic.go:334] "Generic (PLEG): container finished" podID="845be7d4-ea70-441f-8af2-c44870256906" containerID="28c45b2cbdcf3d8161c129cfc8c3d5e54fc40e166f96deaabb0661115f59ad2d" exitCode=0 Feb 23 00:09:50 crc kubenswrapper[4735]: I0223 00:09:50.536891 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530080-85k47" event={"ID":"845be7d4-ea70-441f-8af2-c44870256906","Type":"ContainerDied","Data":"28c45b2cbdcf3d8161c129cfc8c3d5e54fc40e166f96deaabb0661115f59ad2d"} Feb 23 00:09:50 crc kubenswrapper[4735]: I0223 00:09:50.543200 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-vh89z" podStartSLOduration=10.543173735 podStartE2EDuration="10.543173735s" podCreationTimestamp="2026-02-23 00:09:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:09:50.527592329 +0000 UTC m=+148.991138300" watchObservedRunningTime="2026-02-23 00:09:50.543173735 +0000 UTC m=+149.006719706" Feb 23 00:09:50 crc kubenswrapper[4735]: I0223 00:09:50.615904 4735 reconciler.go:161] "OperationExecutor.RegisterPlugin started" 
plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-23T00:09:49.764681206Z","Handler":null,"Name":""} Feb 23 00:09:50 crc kubenswrapper[4735]: I0223 00:09:50.619056 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 00:09:50 crc kubenswrapper[4735]: E0223 00:09:50.622928 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-23 00:09:51.122906444 +0000 UTC m=+149.586452415 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 23 00:09:50 crc kubenswrapper[4735]: I0223 00:09:50.623137 4735 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 23 00:09:50 crc kubenswrapper[4735]: I0223 00:09:50.623163 4735 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 23 00:09:50 crc kubenswrapper[4735]: I0223 00:09:50.704515 4735 patch_prober.go:28] 
interesting pod/router-default-5444994796-f77lm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 00:09:50 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Feb 23 00:09:50 crc kubenswrapper[4735]: [+]process-running ok Feb 23 00:09:50 crc kubenswrapper[4735]: healthz check failed Feb 23 00:09:50 crc kubenswrapper[4735]: I0223 00:09:50.704562 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f77lm" podUID="bfb40dcb-d460-44f4-a461-fc423b7aed52" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 00:09:50 crc kubenswrapper[4735]: W0223 00:09:50.713412 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-145d9db07f347d6f59d1da519aaec4ca9df1a8013adc9168710f47d65a4a7f0e WatchSource:0}: Error finding container 145d9db07f347d6f59d1da519aaec4ca9df1a8013adc9168710f47d65a4a7f0e: Status 404 returned error can't find the container with id 145d9db07f347d6f59d1da519aaec4ca9df1a8013adc9168710f47d65a4a7f0e Feb 23 00:09:50 crc kubenswrapper[4735]: I0223 00:09:50.724116 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:09:50 crc kubenswrapper[4735]: I0223 00:09:50.739266 4735 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 23 00:09:50 crc kubenswrapper[4735]: I0223 00:09:50.739706 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:09:50 crc kubenswrapper[4735]: I0223 00:09:50.812160 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-6lsk2\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:09:50 crc kubenswrapper[4735]: I0223 00:09:50.825003 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 23 00:09:50 crc kubenswrapper[4735]: I0223 00:09:50.841802 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.054554 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.166919 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gcgz5"] Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.183015 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 23 00:09:51 crc kubenswrapper[4735]: W0223 00:09:51.189553 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78bcce9f_48e6_49c5_8f3f_75d8e156f6bc.slice/crio-316cfff5800b138d553bf66a05e7880d2e3155c1b78fa3da05e4a41d28d0b2ff WatchSource:0}: Error finding container 316cfff5800b138d553bf66a05e7880d2e3155c1b78fa3da05e4a41d28d0b2ff: Status 404 returned error can't find the container with id 316cfff5800b138d553bf66a05e7880d2e3155c1b78fa3da05e4a41d28d0b2ff Feb 23 00:09:51 crc kubenswrapper[4735]: W0223 00:09:51.207985 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod17b02993_5531_450d_8056_8b3e9a7bc1d2.slice/crio-dbb6b565f4b8450c20532faa23b659d55b34c927b1a8f337f5844bf2f0e38e3c WatchSource:0}: Error finding container dbb6b565f4b8450c20532faa23b659d55b34c927b1a8f337f5844bf2f0e38e3c: Status 404 returned error can't find the container with id dbb6b565f4b8450c20532faa23b659d55b34c927b1a8f337f5844bf2f0e38e3c Feb 23 
00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.249568 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4gt4h"] Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.250503 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4gt4h" Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.253801 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.268190 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4gt4h"] Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.335973 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bac7145-e696-4926-8d9a-de30ef0c6209-catalog-content\") pod \"redhat-marketplace-4gt4h\" (UID: \"2bac7145-e696-4926-8d9a-de30ef0c6209\") " pod="openshift-marketplace/redhat-marketplace-4gt4h" Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.336354 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bac7145-e696-4926-8d9a-de30ef0c6209-utilities\") pod \"redhat-marketplace-4gt4h\" (UID: \"2bac7145-e696-4926-8d9a-de30ef0c6209\") " pod="openshift-marketplace/redhat-marketplace-4gt4h" Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.336421 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz5sq\" (UniqueName: \"kubernetes.io/projected/2bac7145-e696-4926-8d9a-de30ef0c6209-kube-api-access-wz5sq\") pod \"redhat-marketplace-4gt4h\" (UID: \"2bac7145-e696-4926-8d9a-de30ef0c6209\") " pod="openshift-marketplace/redhat-marketplace-4gt4h" Feb 23 00:09:51 crc kubenswrapper[4735]: 
I0223 00:09:51.361225 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bzbm8"] Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.373666 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6lsk2"] Feb 23 00:09:51 crc kubenswrapper[4735]: W0223 00:09:51.381210 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4ec8027_3683_44b2_a91a_a58f49dedbfd.slice/crio-90856c0643b608717c06400d36d7b1e98305cd45913327c9412d8b0b23d68ba9 WatchSource:0}: Error finding container 90856c0643b608717c06400d36d7b1e98305cd45913327c9412d8b0b23d68ba9: Status 404 returned error can't find the container with id 90856c0643b608717c06400d36d7b1e98305cd45913327c9412d8b0b23d68ba9 Feb 23 00:09:51 crc kubenswrapper[4735]: W0223 00:09:51.385724 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod769cd336_e909_4164_89c9_e0874926fd3d.slice/crio-8ce72b324ba51812a49f5b49552e6fe628b33d566a75b5a6efd4b6b301f0295b WatchSource:0}: Error finding container 8ce72b324ba51812a49f5b49552e6fe628b33d566a75b5a6efd4b6b301f0295b: Status 404 returned error can't find the container with id 8ce72b324ba51812a49f5b49552e6fe628b33d566a75b5a6efd4b6b301f0295b Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.437058 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz5sq\" (UniqueName: \"kubernetes.io/projected/2bac7145-e696-4926-8d9a-de30ef0c6209-kube-api-access-wz5sq\") pod \"redhat-marketplace-4gt4h\" (UID: \"2bac7145-e696-4926-8d9a-de30ef0c6209\") " pod="openshift-marketplace/redhat-marketplace-4gt4h" Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.437114 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/2bac7145-e696-4926-8d9a-de30ef0c6209-catalog-content\") pod \"redhat-marketplace-4gt4h\" (UID: \"2bac7145-e696-4926-8d9a-de30ef0c6209\") " pod="openshift-marketplace/redhat-marketplace-4gt4h" Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.437144 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bac7145-e696-4926-8d9a-de30ef0c6209-utilities\") pod \"redhat-marketplace-4gt4h\" (UID: \"2bac7145-e696-4926-8d9a-de30ef0c6209\") " pod="openshift-marketplace/redhat-marketplace-4gt4h" Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.437673 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bac7145-e696-4926-8d9a-de30ef0c6209-utilities\") pod \"redhat-marketplace-4gt4h\" (UID: \"2bac7145-e696-4926-8d9a-de30ef0c6209\") " pod="openshift-marketplace/redhat-marketplace-4gt4h" Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.438388 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bac7145-e696-4926-8d9a-de30ef0c6209-catalog-content\") pod \"redhat-marketplace-4gt4h\" (UID: \"2bac7145-e696-4926-8d9a-de30ef0c6209\") " pod="openshift-marketplace/redhat-marketplace-4gt4h" Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.465000 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz5sq\" (UniqueName: \"kubernetes.io/projected/2bac7145-e696-4926-8d9a-de30ef0c6209-kube-api-access-wz5sq\") pod \"redhat-marketplace-4gt4h\" (UID: \"2bac7145-e696-4926-8d9a-de30ef0c6209\") " pod="openshift-marketplace/redhat-marketplace-4gt4h" Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.559768 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bzbm8" 
event={"ID":"c4ec8027-3683-44b2-a91a-a58f49dedbfd","Type":"ContainerStarted","Data":"90856c0643b608717c06400d36d7b1e98305cd45913327c9412d8b0b23d68ba9"} Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.561007 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"17b02993-5531-450d-8056-8b3e9a7bc1d2","Type":"ContainerStarted","Data":"dbb6b565f4b8450c20532faa23b659d55b34c927b1a8f337f5844bf2f0e38e3c"} Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.563204 4735 generic.go:334] "Generic (PLEG): container finished" podID="78bcce9f-48e6-49c5-8f3f-75d8e156f6bc" containerID="075a64faff6b67ef7d4a1d9454f9226618ac992b395f4daa76577f30908af33d" exitCode=0 Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.563325 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gcgz5" event={"ID":"78bcce9f-48e6-49c5-8f3f-75d8e156f6bc","Type":"ContainerDied","Data":"075a64faff6b67ef7d4a1d9454f9226618ac992b395f4daa76577f30908af33d"} Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.563350 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gcgz5" event={"ID":"78bcce9f-48e6-49c5-8f3f-75d8e156f6bc","Type":"ContainerStarted","Data":"316cfff5800b138d553bf66a05e7880d2e3155c1b78fa3da05e4a41d28d0b2ff"} Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.565765 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.566598 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"0890ac0c0d84d05b4cbc5e637c9872a24bf3ffcd60b59b125b3b6a22e965a299"} Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.566650 4735 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"b9d2f690d45b28c50a88391aec6c3d81644252fc07339803c66237aa1b279483"} Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.570173 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"9f615e80f30f9cfbae2061f06fa59523812aa3aed394f3ad606e0ec7a9bb16db"} Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.570233 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"145d9db07f347d6f59d1da519aaec4ca9df1a8013adc9168710f47d65a4a7f0e"} Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.571840 4735 generic.go:334] "Generic (PLEG): container finished" podID="f444dee5-d7dc-47ed-add6-b3b1148077f2" containerID="3dbc5778e6645d9d173241cf310187e13a4d5d431d19e3ba8b7a95dd2fdea9be" exitCode=0 Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.571926 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72lxc" event={"ID":"f444dee5-d7dc-47ed-add6-b3b1148077f2","Type":"ContainerDied","Data":"3dbc5778e6645d9d173241cf310187e13a4d5d431d19e3ba8b7a95dd2fdea9be"} Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.571949 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72lxc" event={"ID":"f444dee5-d7dc-47ed-add6-b3b1148077f2","Type":"ContainerStarted","Data":"a8e9a0524209a9712f0f446ee419d2546d688d639a6737975c44ffa269457568"} Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.575074 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" event={"ID":"769cd336-e909-4164-89c9-e0874926fd3d","Type":"ContainerStarted","Data":"8ce72b324ba51812a49f5b49552e6fe628b33d566a75b5a6efd4b6b301f0295b"} Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.575139 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.578316 4735 generic.go:334] "Generic (PLEG): container finished" podID="17e19891-1a63-4c0e-abd9-a161f96cf71e" containerID="2906412ded4239feec64b43dd70a9c33432d07e9bf8e28608c56606e91821364" exitCode=0 Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.578387 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l4mrf" event={"ID":"17e19891-1a63-4c0e-abd9-a161f96cf71e","Type":"ContainerDied","Data":"2906412ded4239feec64b43dd70a9c33432d07e9bf8e28608c56606e91821364"} Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.581908 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"988cf8b39b49092064b2c8f1df1324dac0b52a5f80d51c9ad0008e1d332f09fc"} Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.581937 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"ca3876d6a6dd9f3ac17e4a8ffd97868992bfe79348fdb2880cfe563889bb6865"} Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.592964 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4gt4h" Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.651927 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-74kxk"] Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.652952 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-74kxk" Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.672563 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-74kxk"] Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.695408 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" podStartSLOduration=122.695387919 podStartE2EDuration="2m2.695387919s" podCreationTimestamp="2026-02-23 00:07:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:09:51.693413211 +0000 UTC m=+150.156959182" watchObservedRunningTime="2026-02-23 00:09:51.695387919 +0000 UTC m=+150.158933890" Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.715583 4735 patch_prober.go:28] interesting pod/router-default-5444994796-f77lm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 00:09:51 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Feb 23 00:09:51 crc kubenswrapper[4735]: [+]process-running ok Feb 23 00:09:51 crc kubenswrapper[4735]: healthz check failed Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.715638 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f77lm" podUID="bfb40dcb-d460-44f4-a461-fc423b7aed52" containerName="router" probeResult="failure" output="HTTP 
probe failed with statuscode: 500" Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.715745 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.844411 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e33bc37-4c54-4f50-95b7-bd2cf2da176e-utilities\") pod \"redhat-marketplace-74kxk\" (UID: \"0e33bc37-4c54-4f50-95b7-bd2cf2da176e\") " pod="openshift-marketplace/redhat-marketplace-74kxk" Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.844822 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e33bc37-4c54-4f50-95b7-bd2cf2da176e-catalog-content\") pod \"redhat-marketplace-74kxk\" (UID: \"0e33bc37-4c54-4f50-95b7-bd2cf2da176e\") " pod="openshift-marketplace/redhat-marketplace-74kxk" Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.844953 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25n9n\" (UniqueName: \"kubernetes.io/projected/0e33bc37-4c54-4f50-95b7-bd2cf2da176e-kube-api-access-25n9n\") pod \"redhat-marketplace-74kxk\" (UID: \"0e33bc37-4c54-4f50-95b7-bd2cf2da176e\") " pod="openshift-marketplace/redhat-marketplace-74kxk" Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.946488 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e33bc37-4c54-4f50-95b7-bd2cf2da176e-utilities\") pod \"redhat-marketplace-74kxk\" (UID: \"0e33bc37-4c54-4f50-95b7-bd2cf2da176e\") " pod="openshift-marketplace/redhat-marketplace-74kxk" Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.946549 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e33bc37-4c54-4f50-95b7-bd2cf2da176e-catalog-content\") pod \"redhat-marketplace-74kxk\" (UID: \"0e33bc37-4c54-4f50-95b7-bd2cf2da176e\") " pod="openshift-marketplace/redhat-marketplace-74kxk" Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.946589 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25n9n\" (UniqueName: \"kubernetes.io/projected/0e33bc37-4c54-4f50-95b7-bd2cf2da176e-kube-api-access-25n9n\") pod \"redhat-marketplace-74kxk\" (UID: \"0e33bc37-4c54-4f50-95b7-bd2cf2da176e\") " pod="openshift-marketplace/redhat-marketplace-74kxk" Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.947323 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e33bc37-4c54-4f50-95b7-bd2cf2da176e-utilities\") pod \"redhat-marketplace-74kxk\" (UID: \"0e33bc37-4c54-4f50-95b7-bd2cf2da176e\") " pod="openshift-marketplace/redhat-marketplace-74kxk" Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.947413 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e33bc37-4c54-4f50-95b7-bd2cf2da176e-catalog-content\") pod \"redhat-marketplace-74kxk\" (UID: \"0e33bc37-4c54-4f50-95b7-bd2cf2da176e\") " pod="openshift-marketplace/redhat-marketplace-74kxk" Feb 23 00:09:51 crc kubenswrapper[4735]: I0223 00:09:51.979817 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25n9n\" (UniqueName: \"kubernetes.io/projected/0e33bc37-4c54-4f50-95b7-bd2cf2da176e-kube-api-access-25n9n\") pod \"redhat-marketplace-74kxk\" (UID: \"0e33bc37-4c54-4f50-95b7-bd2cf2da176e\") " pod="openshift-marketplace/redhat-marketplace-74kxk" Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.022793 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530080-85k47" Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.057486 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4gt4h"] Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.147951 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qhf9\" (UniqueName: \"kubernetes.io/projected/845be7d4-ea70-441f-8af2-c44870256906-kube-api-access-8qhf9\") pod \"845be7d4-ea70-441f-8af2-c44870256906\" (UID: \"845be7d4-ea70-441f-8af2-c44870256906\") " Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.148089 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/845be7d4-ea70-441f-8af2-c44870256906-config-volume\") pod \"845be7d4-ea70-441f-8af2-c44870256906\" (UID: \"845be7d4-ea70-441f-8af2-c44870256906\") " Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.148115 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/845be7d4-ea70-441f-8af2-c44870256906-secret-volume\") pod \"845be7d4-ea70-441f-8af2-c44870256906\" (UID: \"845be7d4-ea70-441f-8af2-c44870256906\") " Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.150005 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/845be7d4-ea70-441f-8af2-c44870256906-config-volume" (OuterVolumeSpecName: "config-volume") pod "845be7d4-ea70-441f-8af2-c44870256906" (UID: "845be7d4-ea70-441f-8af2-c44870256906"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.152302 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/845be7d4-ea70-441f-8af2-c44870256906-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "845be7d4-ea70-441f-8af2-c44870256906" (UID: "845be7d4-ea70-441f-8af2-c44870256906"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.152397 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/845be7d4-ea70-441f-8af2-c44870256906-kube-api-access-8qhf9" (OuterVolumeSpecName: "kube-api-access-8qhf9") pod "845be7d4-ea70-441f-8af2-c44870256906" (UID: "845be7d4-ea70-441f-8af2-c44870256906"). InnerVolumeSpecName "kube-api-access-8qhf9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.249442 4735 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/845be7d4-ea70-441f-8af2-c44870256906-config-volume\") on node \"crc\" DevicePath \"\"" Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.249480 4735 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/845be7d4-ea70-441f-8af2-c44870256906-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.249492 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qhf9\" (UniqueName: \"kubernetes.io/projected/845be7d4-ea70-441f-8af2-c44870256906-kube-api-access-8qhf9\") on node \"crc\" DevicePath \"\"" Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.253812 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mvwmg"] Feb 23 00:09:52 crc kubenswrapper[4735]: E0223 00:09:52.254596 4735 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="845be7d4-ea70-441f-8af2-c44870256906" containerName="collect-profiles" Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.254646 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="845be7d4-ea70-441f-8af2-c44870256906" containerName="collect-profiles" Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.254807 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="845be7d4-ea70-441f-8af2-c44870256906" containerName="collect-profiles" Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.255706 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mvwmg" Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.260283 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.274267 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-74kxk" Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.312516 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.314096 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mvwmg"] Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.351676 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzbsx\" (UniqueName: \"kubernetes.io/projected/703af3dd-f895-4e96-991a-7e8a405bb03e-kube-api-access-fzbsx\") pod \"redhat-operators-mvwmg\" (UID: \"703af3dd-f895-4e96-991a-7e8a405bb03e\") " pod="openshift-marketplace/redhat-operators-mvwmg" Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.351743 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/703af3dd-f895-4e96-991a-7e8a405bb03e-catalog-content\") pod \"redhat-operators-mvwmg\" (UID: \"703af3dd-f895-4e96-991a-7e8a405bb03e\") " pod="openshift-marketplace/redhat-operators-mvwmg" Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.351770 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/703af3dd-f895-4e96-991a-7e8a405bb03e-utilities\") pod \"redhat-operators-mvwmg\" (UID: \"703af3dd-f895-4e96-991a-7e8a405bb03e\") " pod="openshift-marketplace/redhat-operators-mvwmg" Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.478543 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzbsx\" (UniqueName: \"kubernetes.io/projected/703af3dd-f895-4e96-991a-7e8a405bb03e-kube-api-access-fzbsx\") 
pod \"redhat-operators-mvwmg\" (UID: \"703af3dd-f895-4e96-991a-7e8a405bb03e\") " pod="openshift-marketplace/redhat-operators-mvwmg" Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.478597 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/703af3dd-f895-4e96-991a-7e8a405bb03e-catalog-content\") pod \"redhat-operators-mvwmg\" (UID: \"703af3dd-f895-4e96-991a-7e8a405bb03e\") " pod="openshift-marketplace/redhat-operators-mvwmg" Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.478619 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/703af3dd-f895-4e96-991a-7e8a405bb03e-utilities\") pod \"redhat-operators-mvwmg\" (UID: \"703af3dd-f895-4e96-991a-7e8a405bb03e\") " pod="openshift-marketplace/redhat-operators-mvwmg" Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.479552 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/703af3dd-f895-4e96-991a-7e8a405bb03e-utilities\") pod \"redhat-operators-mvwmg\" (UID: \"703af3dd-f895-4e96-991a-7e8a405bb03e\") " pod="openshift-marketplace/redhat-operators-mvwmg" Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.480027 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/703af3dd-f895-4e96-991a-7e8a405bb03e-catalog-content\") pod \"redhat-operators-mvwmg\" (UID: \"703af3dd-f895-4e96-991a-7e8a405bb03e\") " pod="openshift-marketplace/redhat-operators-mvwmg" Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.504923 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzbsx\" (UniqueName: \"kubernetes.io/projected/703af3dd-f895-4e96-991a-7e8a405bb03e-kube-api-access-fzbsx\") pod \"redhat-operators-mvwmg\" (UID: \"703af3dd-f895-4e96-991a-7e8a405bb03e\") " 
pod="openshift-marketplace/redhat-operators-mvwmg" Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.572203 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mvwmg" Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.613190 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" event={"ID":"769cd336-e909-4164-89c9-e0874926fd3d","Type":"ContainerStarted","Data":"f3b3df2f386cc3a232b61cb7782c538a5978f91757cf15847426019748f25a92"} Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.616897 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530080-85k47" event={"ID":"845be7d4-ea70-441f-8af2-c44870256906","Type":"ContainerDied","Data":"a8d6e74bd420642313c860dd5a435255729bb10dad9746b51017fceefe49778e"} Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.616921 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8d6e74bd420642313c860dd5a435255729bb10dad9746b51017fceefe49778e" Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.617009 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530080-85k47" Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.619351 4735 generic.go:334] "Generic (PLEG): container finished" podID="2bac7145-e696-4926-8d9a-de30ef0c6209" containerID="a6802770d7830967bfb75e75172a24a6d189aece352443af0e4e72246b51ab73" exitCode=0 Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.619408 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4gt4h" event={"ID":"2bac7145-e696-4926-8d9a-de30ef0c6209","Type":"ContainerDied","Data":"a6802770d7830967bfb75e75172a24a6d189aece352443af0e4e72246b51ab73"} Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.619422 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4gt4h" event={"ID":"2bac7145-e696-4926-8d9a-de30ef0c6209","Type":"ContainerStarted","Data":"ef39aefa5580de7ff0a0adf13684d3abbb74b26cbcdd72e216b62bb63dd77f8d"} Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.625802 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-74kxk"] Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.666791 4735 generic.go:334] "Generic (PLEG): container finished" podID="c4ec8027-3683-44b2-a91a-a58f49dedbfd" containerID="2e1a361143ec447e2ad086d99d64b28ef90ea60462ce1fc4cfa11d8b8699c7ca" exitCode=0 Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.666871 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bzbm8" event={"ID":"c4ec8027-3683-44b2-a91a-a58f49dedbfd","Type":"ContainerDied","Data":"2e1a361143ec447e2ad086d99d64b28ef90ea60462ce1fc4cfa11d8b8699c7ca"} Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.674306 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5xl8z"] Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.675637 4735 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5xl8z" Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.678124 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5xl8z"] Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.685996 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b4dd4f4-5ead-4f6f-a993-34effb6df863-utilities\") pod \"redhat-operators-5xl8z\" (UID: \"6b4dd4f4-5ead-4f6f-a993-34effb6df863\") " pod="openshift-marketplace/redhat-operators-5xl8z" Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.686066 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kntm8\" (UniqueName: \"kubernetes.io/projected/6b4dd4f4-5ead-4f6f-a993-34effb6df863-kube-api-access-kntm8\") pod \"redhat-operators-5xl8z\" (UID: \"6b4dd4f4-5ead-4f6f-a993-34effb6df863\") " pod="openshift-marketplace/redhat-operators-5xl8z" Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.686108 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b4dd4f4-5ead-4f6f-a993-34effb6df863-catalog-content\") pod \"redhat-operators-5xl8z\" (UID: \"6b4dd4f4-5ead-4f6f-a993-34effb6df863\") " pod="openshift-marketplace/redhat-operators-5xl8z" Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.704801 4735 generic.go:334] "Generic (PLEG): container finished" podID="17b02993-5531-450d-8056-8b3e9a7bc1d2" containerID="ca9b64100bf8c61cef0f1cac2f6c67137ef345f8baf079140e4a33fee8bb86dd" exitCode=0 Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.704866 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"17b02993-5531-450d-8056-8b3e9a7bc1d2","Type":"ContainerDied","Data":"ca9b64100bf8c61cef0f1cac2f6c67137ef345f8baf079140e4a33fee8bb86dd"} Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.716066 4735 patch_prober.go:28] interesting pod/router-default-5444994796-f77lm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 23 00:09:52 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld Feb 23 00:09:52 crc kubenswrapper[4735]: [+]process-running ok Feb 23 00:09:52 crc kubenswrapper[4735]: healthz check failed Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.716136 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f77lm" podUID="bfb40dcb-d460-44f4-a461-fc423b7aed52" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.789878 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b4dd4f4-5ead-4f6f-a993-34effb6df863-catalog-content\") pod \"redhat-operators-5xl8z\" (UID: \"6b4dd4f4-5ead-4f6f-a993-34effb6df863\") " pod="openshift-marketplace/redhat-operators-5xl8z" Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.789931 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b4dd4f4-5ead-4f6f-a993-34effb6df863-catalog-content\") pod \"redhat-operators-5xl8z\" (UID: \"6b4dd4f4-5ead-4f6f-a993-34effb6df863\") " pod="openshift-marketplace/redhat-operators-5xl8z" Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.790016 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b4dd4f4-5ead-4f6f-a993-34effb6df863-utilities\") 
pod \"redhat-operators-5xl8z\" (UID: \"6b4dd4f4-5ead-4f6f-a993-34effb6df863\") " pod="openshift-marketplace/redhat-operators-5xl8z" Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.790069 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kntm8\" (UniqueName: \"kubernetes.io/projected/6b4dd4f4-5ead-4f6f-a993-34effb6df863-kube-api-access-kntm8\") pod \"redhat-operators-5xl8z\" (UID: \"6b4dd4f4-5ead-4f6f-a993-34effb6df863\") " pod="openshift-marketplace/redhat-operators-5xl8z" Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.791940 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b4dd4f4-5ead-4f6f-a993-34effb6df863-utilities\") pod \"redhat-operators-5xl8z\" (UID: \"6b4dd4f4-5ead-4f6f-a993-34effb6df863\") " pod="openshift-marketplace/redhat-operators-5xl8z" Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.799956 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g2t27" Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.800304 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g2t27" Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.816321 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kntm8\" (UniqueName: \"kubernetes.io/projected/6b4dd4f4-5ead-4f6f-a993-34effb6df863-kube-api-access-kntm8\") pod \"redhat-operators-5xl8z\" (UID: \"6b4dd4f4-5ead-4f6f-a993-34effb6df863\") " pod="openshift-marketplace/redhat-operators-5xl8z" Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.818517 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g2t27" Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.870774 4735 patch_prober.go:28] interesting 
pod/downloads-7954f5f757-dsqjf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.871126 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dsqjf" podUID="b40b3484-891d-4669-838b-90c1b8d6869e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.871363 4735 patch_prober.go:28] interesting pod/downloads-7954f5f757-dsqjf container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.871434 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-dsqjf" podUID="b40b3484-891d-4669-838b-90c1b8d6869e" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.885375 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-g5xff" Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.885424 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-g5xff" Feb 23 00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.893077 4735 patch_prober.go:28] interesting pod/console-f9d7485db-g5xff container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Feb 23 
00:09:52 crc kubenswrapper[4735]: I0223 00:09:52.893118 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-g5xff" podUID="92d294e9-091b-420f-aae8-da3bcab119e4" containerName="console" probeResult="failure" output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" Feb 23 00:09:53 crc kubenswrapper[4735]: I0223 00:09:53.058394 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5xl8z" Feb 23 00:09:53 crc kubenswrapper[4735]: I0223 00:09:53.071204 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-87xcv" Feb 23 00:09:53 crc kubenswrapper[4735]: I0223 00:09:53.071281 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-87xcv" Feb 23 00:09:53 crc kubenswrapper[4735]: I0223 00:09:53.101817 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-87xcv" Feb 23 00:09:53 crc kubenswrapper[4735]: I0223 00:09:53.194325 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mvwmg"] Feb 23 00:09:53 crc kubenswrapper[4735]: I0223 00:09:53.399230 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5xl8z"] Feb 23 00:09:53 crc kubenswrapper[4735]: W0223 00:09:53.429878 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b4dd4f4_5ead_4f6f_a993_34effb6df863.slice/crio-a82dd4bb37a675b0056a2017fc7d3f0ce1ceab08fbe58618cd5030097425f57d WatchSource:0}: Error finding container a82dd4bb37a675b0056a2017fc7d3f0ce1ceab08fbe58618cd5030097425f57d: Status 404 returned error can't find the container with id a82dd4bb37a675b0056a2017fc7d3f0ce1ceab08fbe58618cd5030097425f57d Feb 23 
00:09:53 crc kubenswrapper[4735]: I0223 00:09:53.695303 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-f77lm"
Feb 23 00:09:53 crc kubenswrapper[4735]: I0223 00:09:53.700488 4735 patch_prober.go:28] interesting pod/router-default-5444994796-f77lm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 00:09:53 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld
Feb 23 00:09:53 crc kubenswrapper[4735]: [+]process-running ok
Feb 23 00:09:53 crc kubenswrapper[4735]: healthz check failed
Feb 23 00:09:53 crc kubenswrapper[4735]: I0223 00:09:53.700540 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f77lm" podUID="bfb40dcb-d460-44f4-a461-fc423b7aed52" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 00:09:53 crc kubenswrapper[4735]: I0223 00:09:53.717052 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5xl8z" event={"ID":"6b4dd4f4-5ead-4f6f-a993-34effb6df863","Type":"ContainerStarted","Data":"4bf02ec4c27f762900c43e5d6f61665a6dd764f6f5045271f71b90a55f54e852"}
Feb 23 00:09:53 crc kubenswrapper[4735]: I0223 00:09:53.717099 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5xl8z" event={"ID":"6b4dd4f4-5ead-4f6f-a993-34effb6df863","Type":"ContainerStarted","Data":"a82dd4bb37a675b0056a2017fc7d3f0ce1ceab08fbe58618cd5030097425f57d"}
Feb 23 00:09:53 crc kubenswrapper[4735]: I0223 00:09:53.723954 4735 generic.go:334] "Generic (PLEG): container finished" podID="0e33bc37-4c54-4f50-95b7-bd2cf2da176e" containerID="1b5e2521901e4676f48813d4d5cd59f8085da07bea1a63541f3a7349f76b9c4c" exitCode=0
Feb 23 00:09:53 crc kubenswrapper[4735]: I0223 00:09:53.724004 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-74kxk" event={"ID":"0e33bc37-4c54-4f50-95b7-bd2cf2da176e","Type":"ContainerDied","Data":"1b5e2521901e4676f48813d4d5cd59f8085da07bea1a63541f3a7349f76b9c4c"}
Feb 23 00:09:53 crc kubenswrapper[4735]: I0223 00:09:53.724024 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-74kxk" event={"ID":"0e33bc37-4c54-4f50-95b7-bd2cf2da176e","Type":"ContainerStarted","Data":"b8fef3900c6546e311dd961fa0a0c81f83d4c53e096975eab1990148820d32ac"}
Feb 23 00:09:53 crc kubenswrapper[4735]: I0223 00:09:53.729321 4735 generic.go:334] "Generic (PLEG): container finished" podID="703af3dd-f895-4e96-991a-7e8a405bb03e" containerID="ff364cb769ee3e3749210e14bd4be3e6cddec74f6ae80a77511f10d6b65772d8" exitCode=0
Feb 23 00:09:53 crc kubenswrapper[4735]: I0223 00:09:53.729785 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mvwmg" event={"ID":"703af3dd-f895-4e96-991a-7e8a405bb03e","Type":"ContainerDied","Data":"ff364cb769ee3e3749210e14bd4be3e6cddec74f6ae80a77511f10d6b65772d8"}
Feb 23 00:09:53 crc kubenswrapper[4735]: I0223 00:09:53.729814 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mvwmg" event={"ID":"703af3dd-f895-4e96-991a-7e8a405bb03e","Type":"ContainerStarted","Data":"fefd497998a02bb20028a7db5e8b44cccfe20ad226dcb747e963b36259a35adc"}
Feb 23 00:09:53 crc kubenswrapper[4735]: I0223 00:09:53.734905 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-g2t27"
Feb 23 00:09:53 crc kubenswrapper[4735]: I0223 00:09:53.736055 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-87xcv"
Feb 23 00:09:54 crc kubenswrapper[4735]: I0223 00:09:54.163934 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 23 00:09:54 crc kubenswrapper[4735]: I0223 00:09:54.325355 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/17b02993-5531-450d-8056-8b3e9a7bc1d2-kubelet-dir\") pod \"17b02993-5531-450d-8056-8b3e9a7bc1d2\" (UID: \"17b02993-5531-450d-8056-8b3e9a7bc1d2\") "
Feb 23 00:09:54 crc kubenswrapper[4735]: I0223 00:09:54.325402 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17b02993-5531-450d-8056-8b3e9a7bc1d2-kube-api-access\") pod \"17b02993-5531-450d-8056-8b3e9a7bc1d2\" (UID: \"17b02993-5531-450d-8056-8b3e9a7bc1d2\") "
Feb 23 00:09:54 crc kubenswrapper[4735]: I0223 00:09:54.326653 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/17b02993-5531-450d-8056-8b3e9a7bc1d2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "17b02993-5531-450d-8056-8b3e9a7bc1d2" (UID: "17b02993-5531-450d-8056-8b3e9a7bc1d2"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 00:09:54 crc kubenswrapper[4735]: I0223 00:09:54.330772 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17b02993-5531-450d-8056-8b3e9a7bc1d2-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "17b02993-5531-450d-8056-8b3e9a7bc1d2" (UID: "17b02993-5531-450d-8056-8b3e9a7bc1d2"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 00:09:54 crc kubenswrapper[4735]: I0223 00:09:54.427262 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17b02993-5531-450d-8056-8b3e9a7bc1d2-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 23 00:09:54 crc kubenswrapper[4735]: I0223 00:09:54.427312 4735 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/17b02993-5531-450d-8056-8b3e9a7bc1d2-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 23 00:09:54 crc kubenswrapper[4735]: I0223 00:09:54.455116 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Feb 23 00:09:54 crc kubenswrapper[4735]: E0223 00:09:54.455386 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17b02993-5531-450d-8056-8b3e9a7bc1d2" containerName="pruner"
Feb 23 00:09:54 crc kubenswrapper[4735]: I0223 00:09:54.455397 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="17b02993-5531-450d-8056-8b3e9a7bc1d2" containerName="pruner"
Feb 23 00:09:54 crc kubenswrapper[4735]: I0223 00:09:54.459102 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="17b02993-5531-450d-8056-8b3e9a7bc1d2" containerName="pruner"
Feb 23 00:09:54 crc kubenswrapper[4735]: I0223 00:09:54.459447 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 23 00:09:54 crc kubenswrapper[4735]: I0223 00:09:54.459842 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Feb 23 00:09:54 crc kubenswrapper[4735]: I0223 00:09:54.463545 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Feb 23 00:09:54 crc kubenswrapper[4735]: I0223 00:09:54.463774 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Feb 23 00:09:54 crc kubenswrapper[4735]: I0223 00:09:54.629077 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/111be185-20ca-4a4d-a9d9-ead24f0d7c4b-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"111be185-20ca-4a4d-a9d9-ead24f0d7c4b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 23 00:09:54 crc kubenswrapper[4735]: I0223 00:09:54.629270 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/111be185-20ca-4a4d-a9d9-ead24f0d7c4b-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"111be185-20ca-4a4d-a9d9-ead24f0d7c4b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 23 00:09:54 crc kubenswrapper[4735]: I0223 00:09:54.699090 4735 patch_prober.go:28] interesting pod/router-default-5444994796-f77lm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 00:09:54 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld
Feb 23 00:09:54 crc kubenswrapper[4735]: [+]process-running ok
Feb 23 00:09:54 crc kubenswrapper[4735]: healthz check failed
Feb 23 00:09:54 crc kubenswrapper[4735]: I0223 00:09:54.699139 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f77lm" podUID="bfb40dcb-d460-44f4-a461-fc423b7aed52" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 00:09:54 crc kubenswrapper[4735]: I0223 00:09:54.731224 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/111be185-20ca-4a4d-a9d9-ead24f0d7c4b-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"111be185-20ca-4a4d-a9d9-ead24f0d7c4b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 23 00:09:54 crc kubenswrapper[4735]: I0223 00:09:54.731372 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/111be185-20ca-4a4d-a9d9-ead24f0d7c4b-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"111be185-20ca-4a4d-a9d9-ead24f0d7c4b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 23 00:09:54 crc kubenswrapper[4735]: I0223 00:09:54.732133 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/111be185-20ca-4a4d-a9d9-ead24f0d7c4b-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"111be185-20ca-4a4d-a9d9-ead24f0d7c4b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 23 00:09:54 crc kubenswrapper[4735]: I0223 00:09:54.742009 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"17b02993-5531-450d-8056-8b3e9a7bc1d2","Type":"ContainerDied","Data":"dbb6b565f4b8450c20532faa23b659d55b34c927b1a8f337f5844bf2f0e38e3c"}
Feb 23 00:09:54 crc kubenswrapper[4735]: I0223 00:09:54.742040 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbb6b565f4b8450c20532faa23b659d55b34c927b1a8f337f5844bf2f0e38e3c"
Feb 23 00:09:54 crc kubenswrapper[4735]: I0223 00:09:54.742052 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 23 00:09:54 crc kubenswrapper[4735]: I0223 00:09:54.760364 4735 generic.go:334] "Generic (PLEG): container finished" podID="6b4dd4f4-5ead-4f6f-a993-34effb6df863" containerID="4bf02ec4c27f762900c43e5d6f61665a6dd764f6f5045271f71b90a55f54e852" exitCode=0
Feb 23 00:09:54 crc kubenswrapper[4735]: I0223 00:09:54.760503 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5xl8z" event={"ID":"6b4dd4f4-5ead-4f6f-a993-34effb6df863","Type":"ContainerDied","Data":"4bf02ec4c27f762900c43e5d6f61665a6dd764f6f5045271f71b90a55f54e852"}
Feb 23 00:09:54 crc kubenswrapper[4735]: I0223 00:09:54.761012 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/111be185-20ca-4a4d-a9d9-ead24f0d7c4b-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"111be185-20ca-4a4d-a9d9-ead24f0d7c4b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 23 00:09:54 crc kubenswrapper[4735]: I0223 00:09:54.782742 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 23 00:09:55 crc kubenswrapper[4735]: I0223 00:09:55.370457 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Feb 23 00:09:55 crc kubenswrapper[4735]: W0223 00:09:55.394099 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod111be185_20ca_4a4d_a9d9_ead24f0d7c4b.slice/crio-50cba8ca06e726fd25d67d1fa632298e1a5c75466a379f581f5645da23a65d81 WatchSource:0}: Error finding container 50cba8ca06e726fd25d67d1fa632298e1a5c75466a379f581f5645da23a65d81: Status 404 returned error can't find the container with id 50cba8ca06e726fd25d67d1fa632298e1a5c75466a379f581f5645da23a65d81
Feb 23 00:09:55 crc kubenswrapper[4735]: I0223 00:09:55.698501 4735 patch_prober.go:28] interesting pod/router-default-5444994796-f77lm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 00:09:55 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld
Feb 23 00:09:55 crc kubenswrapper[4735]: [+]process-running ok
Feb 23 00:09:55 crc kubenswrapper[4735]: healthz check failed
Feb 23 00:09:55 crc kubenswrapper[4735]: I0223 00:09:55.699257 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f77lm" podUID="bfb40dcb-d460-44f4-a461-fc423b7aed52" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 00:09:55 crc kubenswrapper[4735]: I0223 00:09:55.772538 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"111be185-20ca-4a4d-a9d9-ead24f0d7c4b","Type":"ContainerStarted","Data":"50cba8ca06e726fd25d67d1fa632298e1a5c75466a379f581f5645da23a65d81"}
Feb 23 00:09:56 crc kubenswrapper[4735]: I0223 00:09:56.091139 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-5w4w6"
Feb 23 00:09:56 crc kubenswrapper[4735]: I0223 00:09:56.702297 4735 patch_prober.go:28] interesting pod/router-default-5444994796-f77lm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 23 00:09:56 crc kubenswrapper[4735]: [-]has-synced failed: reason withheld
Feb 23 00:09:56 crc kubenswrapper[4735]: [+]process-running ok
Feb 23 00:09:56 crc kubenswrapper[4735]: healthz check failed
Feb 23 00:09:56 crc kubenswrapper[4735]: I0223 00:09:56.702945 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-f77lm" podUID="bfb40dcb-d460-44f4-a461-fc423b7aed52" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 23 00:09:56 crc kubenswrapper[4735]: I0223 00:09:56.798769 4735 generic.go:334] "Generic (PLEG): container finished" podID="111be185-20ca-4a4d-a9d9-ead24f0d7c4b" containerID="8516c73fe4e28fbba7d0be8721c6ce64f468b2971397cc547c6d84ca5ccf9b78" exitCode=0
Feb 23 00:09:56 crc kubenswrapper[4735]: I0223 00:09:56.798837 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"111be185-20ca-4a4d-a9d9-ead24f0d7c4b","Type":"ContainerDied","Data":"8516c73fe4e28fbba7d0be8721c6ce64f468b2971397cc547c6d84ca5ccf9b78"}
Feb 23 00:09:57 crc kubenswrapper[4735]: I0223 00:09:57.699587 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-f77lm"
Feb 23 00:09:57 crc kubenswrapper[4735]: I0223 00:09:57.706945 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-f77lm"
Feb 23 00:09:59 crc kubenswrapper[4735]: I0223 00:09:59.997347 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 23 00:10:02 crc kubenswrapper[4735]: I0223 00:10:02.869222 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-dsqjf"
Feb 23 00:10:02 crc kubenswrapper[4735]: I0223 00:10:02.929285 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-g5xff"
Feb 23 00:10:02 crc kubenswrapper[4735]: I0223 00:10:02.939571 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-g5xff"
Feb 23 00:10:06 crc kubenswrapper[4735]: I0223 00:10:06.106909 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tjkjf"]
Feb 23 00:10:06 crc kubenswrapper[4735]: I0223 00:10:06.109368 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-tjkjf" podUID="299ad63a-4584-4079-81c4-b8645326d3d0" containerName="controller-manager" containerID="cri-o://25beb31bd86154f73322793c0ede04c047f19979f1ad9a63928c3fa015dc3c62" gracePeriod=30
Feb 23 00:10:06 crc kubenswrapper[4735]: I0223 00:10:06.114863 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6xqc"]
Feb 23 00:10:06 crc kubenswrapper[4735]: I0223 00:10:06.115078 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6xqc" podUID="25de1778-6e9d-4158-9aa4-02e1607d7f45" containerName="route-controller-manager" containerID="cri-o://2703a3cb67f5d683979d8142eb05624f0b79a1b83a77baa096444c226b85cd39" gracePeriod=30
Feb 23 00:10:06 crc kubenswrapper[4735]: I0223 00:10:06.885141 4735 generic.go:334] "Generic (PLEG): container finished" podID="25de1778-6e9d-4158-9aa4-02e1607d7f45" containerID="2703a3cb67f5d683979d8142eb05624f0b79a1b83a77baa096444c226b85cd39" exitCode=0
Feb 23 00:10:06 crc kubenswrapper[4735]: I0223 00:10:06.885199 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6xqc" event={"ID":"25de1778-6e9d-4158-9aa4-02e1607d7f45","Type":"ContainerDied","Data":"2703a3cb67f5d683979d8142eb05624f0b79a1b83a77baa096444c226b85cd39"}
Feb 23 00:10:06 crc kubenswrapper[4735]: I0223 00:10:06.886494 4735 generic.go:334] "Generic (PLEG): container finished" podID="299ad63a-4584-4079-81c4-b8645326d3d0" containerID="25beb31bd86154f73322793c0ede04c047f19979f1ad9a63928c3fa015dc3c62" exitCode=0
Feb 23 00:10:06 crc kubenswrapper[4735]: I0223 00:10:06.886520 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-tjkjf" event={"ID":"299ad63a-4584-4079-81c4-b8645326d3d0","Type":"ContainerDied","Data":"25beb31bd86154f73322793c0ede04c047f19979f1ad9a63928c3fa015dc3c62"}
Feb 23 00:10:08 crc kubenswrapper[4735]: I0223 00:10:08.452846 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 23 00:10:08 crc kubenswrapper[4735]: I0223 00:10:08.557458 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/111be185-20ca-4a4d-a9d9-ead24f0d7c4b-kube-api-access\") pod \"111be185-20ca-4a4d-a9d9-ead24f0d7c4b\" (UID: \"111be185-20ca-4a4d-a9d9-ead24f0d7c4b\") "
Feb 23 00:10:08 crc kubenswrapper[4735]: I0223 00:10:08.557537 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/111be185-20ca-4a4d-a9d9-ead24f0d7c4b-kubelet-dir\") pod \"111be185-20ca-4a4d-a9d9-ead24f0d7c4b\" (UID: \"111be185-20ca-4a4d-a9d9-ead24f0d7c4b\") "
Feb 23 00:10:08 crc kubenswrapper[4735]: I0223 00:10:08.557965 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/111be185-20ca-4a4d-a9d9-ead24f0d7c4b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "111be185-20ca-4a4d-a9d9-ead24f0d7c4b" (UID: "111be185-20ca-4a4d-a9d9-ead24f0d7c4b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 00:10:08 crc kubenswrapper[4735]: I0223 00:10:08.558094 4735 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/111be185-20ca-4a4d-a9d9-ead24f0d7c4b-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 23 00:10:08 crc kubenswrapper[4735]: I0223 00:10:08.565933 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/111be185-20ca-4a4d-a9d9-ead24f0d7c4b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "111be185-20ca-4a4d-a9d9-ead24f0d7c4b" (UID: "111be185-20ca-4a4d-a9d9-ead24f0d7c4b"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 00:10:08 crc kubenswrapper[4735]: I0223 00:10:08.659288 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/111be185-20ca-4a4d-a9d9-ead24f0d7c4b-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 23 00:10:08 crc kubenswrapper[4735]: I0223 00:10:08.933638 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"111be185-20ca-4a4d-a9d9-ead24f0d7c4b","Type":"ContainerDied","Data":"50cba8ca06e726fd25d67d1fa632298e1a5c75466a379f581f5645da23a65d81"}
Feb 23 00:10:08 crc kubenswrapper[4735]: I0223 00:10:08.934039 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50cba8ca06e726fd25d67d1fa632298e1a5c75466a379f581f5645da23a65d81"
Feb 23 00:10:08 crc kubenswrapper[4735]: I0223 00:10:08.933755 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 23 00:10:10 crc kubenswrapper[4735]: I0223 00:10:10.847881 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2"
Feb 23 00:10:11 crc kubenswrapper[4735]: I0223 00:10:11.512413 4735 patch_prober.go:28] interesting pod/machine-config-daemon-blmnv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 00:10:11 crc kubenswrapper[4735]: I0223 00:10:11.512466 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" podUID="1cba474f-2d55-4a07-969f-25e2817a06d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 00:10:12 crc kubenswrapper[4735]: I0223 00:10:12.010051 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b542cb9e-35cc-44d9-a850-c41887636c4c-metrics-certs\") pod \"network-metrics-daemon-bdqfd\" (UID: \"b542cb9e-35cc-44d9-a850-c41887636c4c\") " pod="openshift-multus/network-metrics-daemon-bdqfd"
Feb 23 00:10:12 crc kubenswrapper[4735]: I0223 00:10:12.029730 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b542cb9e-35cc-44d9-a850-c41887636c4c-metrics-certs\") pod \"network-metrics-daemon-bdqfd\" (UID: \"b542cb9e-35cc-44d9-a850-c41887636c4c\") " pod="openshift-multus/network-metrics-daemon-bdqfd"
Feb 23 00:10:12 crc kubenswrapper[4735]: I0223 00:10:12.306487 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bdqfd"
Feb 23 00:10:13 crc kubenswrapper[4735]: I0223 00:10:13.994305 4735 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-c6xqc container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 23 00:10:14 crc kubenswrapper[4735]: I0223 00:10:13.994641 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6xqc" podUID="25de1778-6e9d-4158-9aa4-02e1607d7f45" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 23 00:10:14 crc kubenswrapper[4735]: I0223 00:10:13.994375 4735 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-tjkjf container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 23 00:10:14 crc kubenswrapper[4735]: I0223 00:10:13.994745 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-tjkjf" podUID="299ad63a-4584-4079-81c4-b8645326d3d0" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 23 00:10:15 crc kubenswrapper[4735]: I0223 00:10:15.031794 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-tjkjf"
Feb 23 00:10:15 crc kubenswrapper[4735]: I0223 00:10:15.066183 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5d9b8d9bd5-xqdx5"]
Feb 23 00:10:15 crc kubenswrapper[4735]: E0223 00:10:15.066365 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="299ad63a-4584-4079-81c4-b8645326d3d0" containerName="controller-manager"
Feb 23 00:10:15 crc kubenswrapper[4735]: I0223 00:10:15.066375 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="299ad63a-4584-4079-81c4-b8645326d3d0" containerName="controller-manager"
Feb 23 00:10:15 crc kubenswrapper[4735]: E0223 00:10:15.066387 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="111be185-20ca-4a4d-a9d9-ead24f0d7c4b" containerName="pruner"
Feb 23 00:10:15 crc kubenswrapper[4735]: I0223 00:10:15.066393 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="111be185-20ca-4a4d-a9d9-ead24f0d7c4b" containerName="pruner"
Feb 23 00:10:15 crc kubenswrapper[4735]: I0223 00:10:15.066493 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="299ad63a-4584-4079-81c4-b8645326d3d0" containerName="controller-manager"
Feb 23 00:10:15 crc kubenswrapper[4735]: I0223 00:10:15.066510 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="111be185-20ca-4a4d-a9d9-ead24f0d7c4b" containerName="pruner"
Feb 23 00:10:15 crc kubenswrapper[4735]: I0223 00:10:15.066831 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d9b8d9bd5-xqdx5"
Feb 23 00:10:15 crc kubenswrapper[4735]: I0223 00:10:15.078773 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d9b8d9bd5-xqdx5"]
Feb 23 00:10:15 crc kubenswrapper[4735]: I0223 00:10:15.151124 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/299ad63a-4584-4079-81c4-b8645326d3d0-serving-cert\") pod \"299ad63a-4584-4079-81c4-b8645326d3d0\" (UID: \"299ad63a-4584-4079-81c4-b8645326d3d0\") "
Feb 23 00:10:15 crc kubenswrapper[4735]: I0223 00:10:15.151197 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/299ad63a-4584-4079-81c4-b8645326d3d0-proxy-ca-bundles\") pod \"299ad63a-4584-4079-81c4-b8645326d3d0\" (UID: \"299ad63a-4584-4079-81c4-b8645326d3d0\") "
Feb 23 00:10:15 crc kubenswrapper[4735]: I0223 00:10:15.151319 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/299ad63a-4584-4079-81c4-b8645326d3d0-config\") pod \"299ad63a-4584-4079-81c4-b8645326d3d0\" (UID: \"299ad63a-4584-4079-81c4-b8645326d3d0\") "
Feb 23 00:10:15 crc kubenswrapper[4735]: I0223 00:10:15.151346 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktb5p\" (UniqueName: \"kubernetes.io/projected/299ad63a-4584-4079-81c4-b8645326d3d0-kube-api-access-ktb5p\") pod \"299ad63a-4584-4079-81c4-b8645326d3d0\" (UID: \"299ad63a-4584-4079-81c4-b8645326d3d0\") "
Feb 23 00:10:15 crc kubenswrapper[4735]: I0223 00:10:15.151396 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/299ad63a-4584-4079-81c4-b8645326d3d0-client-ca\") pod \"299ad63a-4584-4079-81c4-b8645326d3d0\" (UID: \"299ad63a-4584-4079-81c4-b8645326d3d0\") "
Feb 23 00:10:15 crc kubenswrapper[4735]: I0223 00:10:15.151550 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgfqw\" (UniqueName: \"kubernetes.io/projected/783cdbda-0825-4dc7-9a0b-d96f81ae9c01-kube-api-access-qgfqw\") pod \"controller-manager-5d9b8d9bd5-xqdx5\" (UID: \"783cdbda-0825-4dc7-9a0b-d96f81ae9c01\") " pod="openshift-controller-manager/controller-manager-5d9b8d9bd5-xqdx5"
Feb 23 00:10:15 crc kubenswrapper[4735]: I0223 00:10:15.151585 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/783cdbda-0825-4dc7-9a0b-d96f81ae9c01-client-ca\") pod \"controller-manager-5d9b8d9bd5-xqdx5\" (UID: \"783cdbda-0825-4dc7-9a0b-d96f81ae9c01\") " pod="openshift-controller-manager/controller-manager-5d9b8d9bd5-xqdx5"
Feb 23 00:10:15 crc kubenswrapper[4735]: I0223 00:10:15.151697 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/783cdbda-0825-4dc7-9a0b-d96f81ae9c01-serving-cert\") pod \"controller-manager-5d9b8d9bd5-xqdx5\" (UID: \"783cdbda-0825-4dc7-9a0b-d96f81ae9c01\") " pod="openshift-controller-manager/controller-manager-5d9b8d9bd5-xqdx5"
Feb 23 00:10:15 crc kubenswrapper[4735]: I0223 00:10:15.151772 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/783cdbda-0825-4dc7-9a0b-d96f81ae9c01-proxy-ca-bundles\") pod \"controller-manager-5d9b8d9bd5-xqdx5\" (UID: \"783cdbda-0825-4dc7-9a0b-d96f81ae9c01\") " pod="openshift-controller-manager/controller-manager-5d9b8d9bd5-xqdx5"
Feb 23 00:10:15 crc kubenswrapper[4735]: I0223 00:10:15.151829 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/783cdbda-0825-4dc7-9a0b-d96f81ae9c01-config\") pod \"controller-manager-5d9b8d9bd5-xqdx5\" (UID: \"783cdbda-0825-4dc7-9a0b-d96f81ae9c01\") " pod="openshift-controller-manager/controller-manager-5d9b8d9bd5-xqdx5"
Feb 23 00:10:15 crc kubenswrapper[4735]: I0223 00:10:15.152018 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/299ad63a-4584-4079-81c4-b8645326d3d0-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "299ad63a-4584-4079-81c4-b8645326d3d0" (UID: "299ad63a-4584-4079-81c4-b8645326d3d0"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 00:10:15 crc kubenswrapper[4735]: I0223 00:10:15.152403 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/299ad63a-4584-4079-81c4-b8645326d3d0-config" (OuterVolumeSpecName: "config") pod "299ad63a-4584-4079-81c4-b8645326d3d0" (UID: "299ad63a-4584-4079-81c4-b8645326d3d0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 00:10:15 crc kubenswrapper[4735]: I0223 00:10:15.152413 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/299ad63a-4584-4079-81c4-b8645326d3d0-client-ca" (OuterVolumeSpecName: "client-ca") pod "299ad63a-4584-4079-81c4-b8645326d3d0" (UID: "299ad63a-4584-4079-81c4-b8645326d3d0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 00:10:15 crc kubenswrapper[4735]: I0223 00:10:15.156936 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/299ad63a-4584-4079-81c4-b8645326d3d0-kube-api-access-ktb5p" (OuterVolumeSpecName: "kube-api-access-ktb5p") pod "299ad63a-4584-4079-81c4-b8645326d3d0" (UID: "299ad63a-4584-4079-81c4-b8645326d3d0"). InnerVolumeSpecName "kube-api-access-ktb5p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 00:10:15 crc kubenswrapper[4735]: I0223 00:10:15.157567 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/299ad63a-4584-4079-81c4-b8645326d3d0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "299ad63a-4584-4079-81c4-b8645326d3d0" (UID: "299ad63a-4584-4079-81c4-b8645326d3d0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 00:10:15 crc kubenswrapper[4735]: I0223 00:10:15.252772 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgfqw\" (UniqueName: \"kubernetes.io/projected/783cdbda-0825-4dc7-9a0b-d96f81ae9c01-kube-api-access-qgfqw\") pod \"controller-manager-5d9b8d9bd5-xqdx5\" (UID: \"783cdbda-0825-4dc7-9a0b-d96f81ae9c01\") " pod="openshift-controller-manager/controller-manager-5d9b8d9bd5-xqdx5"
Feb 23 00:10:15 crc kubenswrapper[4735]: I0223 00:10:15.252834 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/783cdbda-0825-4dc7-9a0b-d96f81ae9c01-client-ca\") pod \"controller-manager-5d9b8d9bd5-xqdx5\" (UID: \"783cdbda-0825-4dc7-9a0b-d96f81ae9c01\") " pod="openshift-controller-manager/controller-manager-5d9b8d9bd5-xqdx5"
Feb 23 00:10:15 crc kubenswrapper[4735]: I0223 00:10:15.252903 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/783cdbda-0825-4dc7-9a0b-d96f81ae9c01-serving-cert\") pod \"controller-manager-5d9b8d9bd5-xqdx5\" (UID: \"783cdbda-0825-4dc7-9a0b-d96f81ae9c01\") " pod="openshift-controller-manager/controller-manager-5d9b8d9bd5-xqdx5"
Feb 23 00:10:15 crc kubenswrapper[4735]: I0223 00:10:15.252919 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/783cdbda-0825-4dc7-9a0b-d96f81ae9c01-proxy-ca-bundles\") pod \"controller-manager-5d9b8d9bd5-xqdx5\" (UID: \"783cdbda-0825-4dc7-9a0b-d96f81ae9c01\") " pod="openshift-controller-manager/controller-manager-5d9b8d9bd5-xqdx5"
Feb 23 00:10:15 crc kubenswrapper[4735]: I0223 00:10:15.252952 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/783cdbda-0825-4dc7-9a0b-d96f81ae9c01-config\") pod \"controller-manager-5d9b8d9bd5-xqdx5\" (UID: \"783cdbda-0825-4dc7-9a0b-d96f81ae9c01\") " pod="openshift-controller-manager/controller-manager-5d9b8d9bd5-xqdx5"
Feb 23 00:10:15 crc kubenswrapper[4735]: I0223 00:10:15.252995 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/299ad63a-4584-4079-81c4-b8645326d3d0-config\") on node \"crc\" DevicePath \"\""
Feb 23 00:10:15 crc kubenswrapper[4735]: I0223 00:10:15.253005 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktb5p\" (UniqueName: \"kubernetes.io/projected/299ad63a-4584-4079-81c4-b8645326d3d0-kube-api-access-ktb5p\") on node \"crc\" DevicePath \"\""
Feb 23 00:10:15 crc kubenswrapper[4735]: I0223 00:10:15.253014 4735 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/299ad63a-4584-4079-81c4-b8645326d3d0-client-ca\") on node \"crc\" DevicePath \"\""
Feb 23 00:10:15 crc kubenswrapper[4735]: I0223 00:10:15.253023 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/299ad63a-4584-4079-81c4-b8645326d3d0-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 23 00:10:15 crc kubenswrapper[4735]: I0223 00:10:15.253032 4735 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/299ad63a-4584-4079-81c4-b8645326d3d0-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 23 00:10:15 crc kubenswrapper[4735]: I0223 00:10:15.255523 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/783cdbda-0825-4dc7-9a0b-d96f81ae9c01-proxy-ca-bundles\") pod \"controller-manager-5d9b8d9bd5-xqdx5\" (UID: \"783cdbda-0825-4dc7-9a0b-d96f81ae9c01\") " pod="openshift-controller-manager/controller-manager-5d9b8d9bd5-xqdx5"
Feb 23 00:10:15 crc kubenswrapper[4735]: I0223 00:10:15.256036 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/783cdbda-0825-4dc7-9a0b-d96f81ae9c01-config\") pod \"controller-manager-5d9b8d9bd5-xqdx5\" (UID: \"783cdbda-0825-4dc7-9a0b-d96f81ae9c01\") " pod="openshift-controller-manager/controller-manager-5d9b8d9bd5-xqdx5"
Feb 23 00:10:15 crc kubenswrapper[4735]: I0223 00:10:15.256213 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/783cdbda-0825-4dc7-9a0b-d96f81ae9c01-client-ca\") pod \"controller-manager-5d9b8d9bd5-xqdx5\" (UID: \"783cdbda-0825-4dc7-9a0b-d96f81ae9c01\") " pod="openshift-controller-manager/controller-manager-5d9b8d9bd5-xqdx5"
Feb 23 00:10:15 crc kubenswrapper[4735]: I0223 00:10:15.265899 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/783cdbda-0825-4dc7-9a0b-d96f81ae9c01-serving-cert\") pod \"controller-manager-5d9b8d9bd5-xqdx5\" (UID: \"783cdbda-0825-4dc7-9a0b-d96f81ae9c01\") " pod="openshift-controller-manager/controller-manager-5d9b8d9bd5-xqdx5"
Feb 23 00:10:15 crc kubenswrapper[4735]: I0223 00:10:15.270182 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgfqw\" (UniqueName: \"kubernetes.io/projected/783cdbda-0825-4dc7-9a0b-d96f81ae9c01-kube-api-access-qgfqw\") pod \"controller-manager-5d9b8d9bd5-xqdx5\" (UID: \"783cdbda-0825-4dc7-9a0b-d96f81ae9c01\") " pod="openshift-controller-manager/controller-manager-5d9b8d9bd5-xqdx5"
Feb 23 00:10:15 crc kubenswrapper[4735]: I0223 00:10:15.420533 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d9b8d9bd5-xqdx5"
Feb 23 00:10:15 crc kubenswrapper[4735]: I0223 00:10:15.973728 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-tjkjf" event={"ID":"299ad63a-4584-4079-81c4-b8645326d3d0","Type":"ContainerDied","Data":"6af1468a116382ceeef9fe83bb36864e6e9681d1dc040b15439acb88988d38b3"}
Feb 23 00:10:15 crc kubenswrapper[4735]: I0223 00:10:15.973775 4735 scope.go:117] "RemoveContainer" containerID="25beb31bd86154f73322793c0ede04c047f19979f1ad9a63928c3fa015dc3c62"
Feb 23 00:10:15 crc kubenswrapper[4735]: I0223 00:10:15.973873 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-tjkjf" Feb 23 00:10:16 crc kubenswrapper[4735]: I0223 00:10:16.016931 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tjkjf"] Feb 23 00:10:16 crc kubenswrapper[4735]: I0223 00:10:16.025672 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tjkjf"] Feb 23 00:10:16 crc kubenswrapper[4735]: I0223 00:10:16.280003 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="299ad63a-4584-4079-81c4-b8645326d3d0" path="/var/lib/kubelet/pods/299ad63a-4584-4079-81c4-b8645326d3d0/volumes" Feb 23 00:10:16 crc kubenswrapper[4735]: I0223 00:10:16.419087 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6xqc" Feb 23 00:10:16 crc kubenswrapper[4735]: I0223 00:10:16.477288 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25de1778-6e9d-4158-9aa4-02e1607d7f45-config\") pod \"25de1778-6e9d-4158-9aa4-02e1607d7f45\" (UID: \"25de1778-6e9d-4158-9aa4-02e1607d7f45\") " Feb 23 00:10:16 crc kubenswrapper[4735]: I0223 00:10:16.477347 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25de1778-6e9d-4158-9aa4-02e1607d7f45-serving-cert\") pod \"25de1778-6e9d-4158-9aa4-02e1607d7f45\" (UID: \"25de1778-6e9d-4158-9aa4-02e1607d7f45\") " Feb 23 00:10:16 crc kubenswrapper[4735]: I0223 00:10:16.477382 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9lqk\" (UniqueName: \"kubernetes.io/projected/25de1778-6e9d-4158-9aa4-02e1607d7f45-kube-api-access-z9lqk\") pod \"25de1778-6e9d-4158-9aa4-02e1607d7f45\" (UID: 
\"25de1778-6e9d-4158-9aa4-02e1607d7f45\") " Feb 23 00:10:16 crc kubenswrapper[4735]: I0223 00:10:16.477399 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25de1778-6e9d-4158-9aa4-02e1607d7f45-client-ca\") pod \"25de1778-6e9d-4158-9aa4-02e1607d7f45\" (UID: \"25de1778-6e9d-4158-9aa4-02e1607d7f45\") " Feb 23 00:10:16 crc kubenswrapper[4735]: I0223 00:10:16.478241 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25de1778-6e9d-4158-9aa4-02e1607d7f45-client-ca" (OuterVolumeSpecName: "client-ca") pod "25de1778-6e9d-4158-9aa4-02e1607d7f45" (UID: "25de1778-6e9d-4158-9aa4-02e1607d7f45"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:10:16 crc kubenswrapper[4735]: I0223 00:10:16.478282 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25de1778-6e9d-4158-9aa4-02e1607d7f45-config" (OuterVolumeSpecName: "config") pod "25de1778-6e9d-4158-9aa4-02e1607d7f45" (UID: "25de1778-6e9d-4158-9aa4-02e1607d7f45"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:10:16 crc kubenswrapper[4735]: I0223 00:10:16.482800 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25de1778-6e9d-4158-9aa4-02e1607d7f45-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "25de1778-6e9d-4158-9aa4-02e1607d7f45" (UID: "25de1778-6e9d-4158-9aa4-02e1607d7f45"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:10:16 crc kubenswrapper[4735]: I0223 00:10:16.483542 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25de1778-6e9d-4158-9aa4-02e1607d7f45-kube-api-access-z9lqk" (OuterVolumeSpecName: "kube-api-access-z9lqk") pod "25de1778-6e9d-4158-9aa4-02e1607d7f45" (UID: "25de1778-6e9d-4158-9aa4-02e1607d7f45"). InnerVolumeSpecName "kube-api-access-z9lqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:10:16 crc kubenswrapper[4735]: I0223 00:10:16.578297 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25de1778-6e9d-4158-9aa4-02e1607d7f45-config\") on node \"crc\" DevicePath \"\"" Feb 23 00:10:16 crc kubenswrapper[4735]: I0223 00:10:16.578323 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25de1778-6e9d-4158-9aa4-02e1607d7f45-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 00:10:16 crc kubenswrapper[4735]: I0223 00:10:16.578332 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9lqk\" (UniqueName: \"kubernetes.io/projected/25de1778-6e9d-4158-9aa4-02e1607d7f45-kube-api-access-z9lqk\") on node \"crc\" DevicePath \"\"" Feb 23 00:10:16 crc kubenswrapper[4735]: I0223 00:10:16.578342 4735 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25de1778-6e9d-4158-9aa4-02e1607d7f45-client-ca\") on node \"crc\" DevicePath \"\"" Feb 23 00:10:16 crc kubenswrapper[4735]: I0223 00:10:16.978109 4735 generic.go:334] "Generic (PLEG): container finished" podID="6cb942f6-2bc5-4662-929c-849ec29baddc" containerID="bb9ab9016eedde4179bdbf86f86dfceb4cf2b8ddb2635d00305f46180285056f" exitCode=0 Feb 23 00:10:16 crc kubenswrapper[4735]: I0223 00:10:16.978167 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/image-pruner-29530080-8vgj6" event={"ID":"6cb942f6-2bc5-4662-929c-849ec29baddc","Type":"ContainerDied","Data":"bb9ab9016eedde4179bdbf86f86dfceb4cf2b8ddb2635d00305f46180285056f"} Feb 23 00:10:16 crc kubenswrapper[4735]: I0223 00:10:16.979303 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6xqc" event={"ID":"25de1778-6e9d-4158-9aa4-02e1607d7f45","Type":"ContainerDied","Data":"f7fd8bc945c60242fe77809b02a48d625e071f3f0ae869930536b2e3fa0da68f"} Feb 23 00:10:16 crc kubenswrapper[4735]: I0223 00:10:16.979358 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6xqc" Feb 23 00:10:17 crc kubenswrapper[4735]: I0223 00:10:17.001540 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6xqc"] Feb 23 00:10:17 crc kubenswrapper[4735]: I0223 00:10:17.005411 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-c6xqc"] Feb 23 00:10:18 crc kubenswrapper[4735]: I0223 00:10:18.279575 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25de1778-6e9d-4158-9aa4-02e1607d7f45" path="/var/lib/kubelet/pods/25de1778-6e9d-4158-9aa4-02e1607d7f45/volumes" Feb 23 00:10:19 crc kubenswrapper[4735]: I0223 00:10:19.820248 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78f64778cd-svhvh"] Feb 23 00:10:19 crc kubenswrapper[4735]: E0223 00:10:19.820487 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25de1778-6e9d-4158-9aa4-02e1607d7f45" containerName="route-controller-manager" Feb 23 00:10:19 crc kubenswrapper[4735]: I0223 00:10:19.820502 4735 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="25de1778-6e9d-4158-9aa4-02e1607d7f45" containerName="route-controller-manager" Feb 23 00:10:19 crc kubenswrapper[4735]: I0223 00:10:19.820686 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="25de1778-6e9d-4158-9aa4-02e1607d7f45" containerName="route-controller-manager" Feb 23 00:10:19 crc kubenswrapper[4735]: I0223 00:10:19.821220 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-78f64778cd-svhvh" Feb 23 00:10:19 crc kubenswrapper[4735]: I0223 00:10:19.824698 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 23 00:10:19 crc kubenswrapper[4735]: I0223 00:10:19.824728 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 23 00:10:19 crc kubenswrapper[4735]: I0223 00:10:19.824788 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 23 00:10:19 crc kubenswrapper[4735]: I0223 00:10:19.825068 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 23 00:10:19 crc kubenswrapper[4735]: I0223 00:10:19.825117 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 23 00:10:19 crc kubenswrapper[4735]: I0223 00:10:19.826774 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 23 00:10:19 crc kubenswrapper[4735]: I0223 00:10:19.828329 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78f64778cd-svhvh"] Feb 23 00:10:19 crc kubenswrapper[4735]: E0223 00:10:19.896174 4735 log.go:32] "PullImage from image service failed" err="rpc error: 
code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 23 00:10:19 crc kubenswrapper[4735]: E0223 00:10:19.896380 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fzbsx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-mvwmg_openshift-marketplace(703af3dd-f895-4e96-991a-7e8a405bb03e): ErrImagePull: rpc error: code = Canceled desc = copying system image 
from manifest list: copying config: context canceled" logger="UnhandledError" Feb 23 00:10:19 crc kubenswrapper[4735]: E0223 00:10:19.898012 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-mvwmg" podUID="703af3dd-f895-4e96-991a-7e8a405bb03e" Feb 23 00:10:19 crc kubenswrapper[4735]: I0223 00:10:19.920357 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63bc3dc7-0311-4933-ab26-8420f17c823b-serving-cert\") pod \"route-controller-manager-78f64778cd-svhvh\" (UID: \"63bc3dc7-0311-4933-ab26-8420f17c823b\") " pod="openshift-route-controller-manager/route-controller-manager-78f64778cd-svhvh" Feb 23 00:10:19 crc kubenswrapper[4735]: I0223 00:10:19.920406 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63bc3dc7-0311-4933-ab26-8420f17c823b-client-ca\") pod \"route-controller-manager-78f64778cd-svhvh\" (UID: \"63bc3dc7-0311-4933-ab26-8420f17c823b\") " pod="openshift-route-controller-manager/route-controller-manager-78f64778cd-svhvh" Feb 23 00:10:19 crc kubenswrapper[4735]: I0223 00:10:19.920439 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63bc3dc7-0311-4933-ab26-8420f17c823b-config\") pod \"route-controller-manager-78f64778cd-svhvh\" (UID: \"63bc3dc7-0311-4933-ab26-8420f17c823b\") " pod="openshift-route-controller-manager/route-controller-manager-78f64778cd-svhvh" Feb 23 00:10:19 crc kubenswrapper[4735]: I0223 00:10:19.920503 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-r27nm\" (UniqueName: \"kubernetes.io/projected/63bc3dc7-0311-4933-ab26-8420f17c823b-kube-api-access-r27nm\") pod \"route-controller-manager-78f64778cd-svhvh\" (UID: \"63bc3dc7-0311-4933-ab26-8420f17c823b\") " pod="openshift-route-controller-manager/route-controller-manager-78f64778cd-svhvh" Feb 23 00:10:19 crc kubenswrapper[4735]: I0223 00:10:19.936679 4735 scope.go:117] "RemoveContainer" containerID="2703a3cb67f5d683979d8142eb05624f0b79a1b83a77baa096444c226b85cd39" Feb 23 00:10:19 crc kubenswrapper[4735]: I0223 00:10:19.951298 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29530080-8vgj6" Feb 23 00:10:19 crc kubenswrapper[4735]: E0223 00:10:19.955991 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 23 00:10:19 crc kubenswrapper[4735]: E0223 00:10:19.956143 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-25n9n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-74kxk_openshift-marketplace(0e33bc37-4c54-4f50-95b7-bd2cf2da176e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 23 00:10:19 crc kubenswrapper[4735]: E0223 00:10:19.958589 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-74kxk" podUID="0e33bc37-4c54-4f50-95b7-bd2cf2da176e" Feb 23 00:10:19 crc 
kubenswrapper[4735]: E0223 00:10:19.991398 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 23 00:10:19 crc kubenswrapper[4735]: E0223 00:10:19.991556 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kntm8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-5xl8z_openshift-marketplace(6b4dd4f4-5ead-4f6f-a993-34effb6df863): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 23 00:10:19 crc kubenswrapper[4735]: E0223 00:10:19.992980 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-5xl8z" podUID="6b4dd4f4-5ead-4f6f-a993-34effb6df863" Feb 23 00:10:19 crc kubenswrapper[4735]: I0223 00:10:19.999088 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29530080-8vgj6" Feb 23 00:10:19 crc kubenswrapper[4735]: I0223 00:10:19.999076 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29530080-8vgj6" event={"ID":"6cb942f6-2bc5-4662-929c-849ec29baddc","Type":"ContainerDied","Data":"0cd610a4df48fd174a26ab2f61b4eb6765cee4325326738785ad068fab9e5899"} Feb 23 00:10:19 crc kubenswrapper[4735]: I0223 00:10:19.999134 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0cd610a4df48fd174a26ab2f61b4eb6765cee4325326738785ad068fab9e5899" Feb 23 00:10:20 crc kubenswrapper[4735]: E0223 00:10:20.010505 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-mvwmg" podUID="703af3dd-f895-4e96-991a-7e8a405bb03e" Feb 23 00:10:20 crc kubenswrapper[4735]: E0223 00:10:20.018423 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-74kxk" podUID="0e33bc37-4c54-4f50-95b7-bd2cf2da176e" Feb 23 00:10:20 crc kubenswrapper[4735]: I0223 00:10:20.022556 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6cb942f6-2bc5-4662-929c-849ec29baddc-serviceca\") pod \"6cb942f6-2bc5-4662-929c-849ec29baddc\" (UID: \"6cb942f6-2bc5-4662-929c-849ec29baddc\") " Feb 23 00:10:20 crc kubenswrapper[4735]: I0223 00:10:20.022674 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj2xf\" (UniqueName: \"kubernetes.io/projected/6cb942f6-2bc5-4662-929c-849ec29baddc-kube-api-access-pj2xf\") pod \"6cb942f6-2bc5-4662-929c-849ec29baddc\" (UID: \"6cb942f6-2bc5-4662-929c-849ec29baddc\") " Feb 23 00:10:20 crc kubenswrapper[4735]: I0223 00:10:20.022950 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63bc3dc7-0311-4933-ab26-8420f17c823b-serving-cert\") pod \"route-controller-manager-78f64778cd-svhvh\" (UID: \"63bc3dc7-0311-4933-ab26-8420f17c823b\") " pod="openshift-route-controller-manager/route-controller-manager-78f64778cd-svhvh" Feb 23 00:10:20 crc kubenswrapper[4735]: I0223 00:10:20.022984 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63bc3dc7-0311-4933-ab26-8420f17c823b-client-ca\") pod \"route-controller-manager-78f64778cd-svhvh\" (UID: \"63bc3dc7-0311-4933-ab26-8420f17c823b\") " pod="openshift-route-controller-manager/route-controller-manager-78f64778cd-svhvh" Feb 23 00:10:20 crc kubenswrapper[4735]: I0223 00:10:20.023048 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63bc3dc7-0311-4933-ab26-8420f17c823b-config\") pod 
\"route-controller-manager-78f64778cd-svhvh\" (UID: \"63bc3dc7-0311-4933-ab26-8420f17c823b\") " pod="openshift-route-controller-manager/route-controller-manager-78f64778cd-svhvh" Feb 23 00:10:20 crc kubenswrapper[4735]: I0223 00:10:20.023258 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r27nm\" (UniqueName: \"kubernetes.io/projected/63bc3dc7-0311-4933-ab26-8420f17c823b-kube-api-access-r27nm\") pod \"route-controller-manager-78f64778cd-svhvh\" (UID: \"63bc3dc7-0311-4933-ab26-8420f17c823b\") " pod="openshift-route-controller-manager/route-controller-manager-78f64778cd-svhvh" Feb 23 00:10:20 crc kubenswrapper[4735]: I0223 00:10:20.025174 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cb942f6-2bc5-4662-929c-849ec29baddc-serviceca" (OuterVolumeSpecName: "serviceca") pod "6cb942f6-2bc5-4662-929c-849ec29baddc" (UID: "6cb942f6-2bc5-4662-929c-849ec29baddc"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:10:20 crc kubenswrapper[4735]: I0223 00:10:20.027276 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63bc3dc7-0311-4933-ab26-8420f17c823b-client-ca\") pod \"route-controller-manager-78f64778cd-svhvh\" (UID: \"63bc3dc7-0311-4933-ab26-8420f17c823b\") " pod="openshift-route-controller-manager/route-controller-manager-78f64778cd-svhvh" Feb 23 00:10:20 crc kubenswrapper[4735]: I0223 00:10:20.027841 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63bc3dc7-0311-4933-ab26-8420f17c823b-config\") pod \"route-controller-manager-78f64778cd-svhvh\" (UID: \"63bc3dc7-0311-4933-ab26-8420f17c823b\") " pod="openshift-route-controller-manager/route-controller-manager-78f64778cd-svhvh" Feb 23 00:10:20 crc kubenswrapper[4735]: I0223 00:10:20.046158 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cb942f6-2bc5-4662-929c-849ec29baddc-kube-api-access-pj2xf" (OuterVolumeSpecName: "kube-api-access-pj2xf") pod "6cb942f6-2bc5-4662-929c-849ec29baddc" (UID: "6cb942f6-2bc5-4662-929c-849ec29baddc"). InnerVolumeSpecName "kube-api-access-pj2xf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:10:20 crc kubenswrapper[4735]: I0223 00:10:20.054642 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r27nm\" (UniqueName: \"kubernetes.io/projected/63bc3dc7-0311-4933-ab26-8420f17c823b-kube-api-access-r27nm\") pod \"route-controller-manager-78f64778cd-svhvh\" (UID: \"63bc3dc7-0311-4933-ab26-8420f17c823b\") " pod="openshift-route-controller-manager/route-controller-manager-78f64778cd-svhvh" Feb 23 00:10:20 crc kubenswrapper[4735]: I0223 00:10:20.055487 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63bc3dc7-0311-4933-ab26-8420f17c823b-serving-cert\") pod \"route-controller-manager-78f64778cd-svhvh\" (UID: \"63bc3dc7-0311-4933-ab26-8420f17c823b\") " pod="openshift-route-controller-manager/route-controller-manager-78f64778cd-svhvh" Feb 23 00:10:20 crc kubenswrapper[4735]: I0223 00:10:20.124998 4735 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6cb942f6-2bc5-4662-929c-849ec29baddc-serviceca\") on node \"crc\" DevicePath \"\"" Feb 23 00:10:20 crc kubenswrapper[4735]: I0223 00:10:20.125030 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj2xf\" (UniqueName: \"kubernetes.io/projected/6cb942f6-2bc5-4662-929c-849ec29baddc-kube-api-access-pj2xf\") on node \"crc\" DevicePath \"\"" Feb 23 00:10:20 crc kubenswrapper[4735]: I0223 00:10:20.161599 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-78f64778cd-svhvh" Feb 23 00:10:20 crc kubenswrapper[4735]: I0223 00:10:20.199336 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-bdqfd"] Feb 23 00:10:20 crc kubenswrapper[4735]: W0223 00:10:20.203600 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb542cb9e_35cc_44d9_a850_c41887636c4c.slice/crio-70d9562356830a1df284bfa4097ba0879628a8b2c8772ef6fd0d4a199a55a746 WatchSource:0}: Error finding container 70d9562356830a1df284bfa4097ba0879628a8b2c8772ef6fd0d4a199a55a746: Status 404 returned error can't find the container with id 70d9562356830a1df284bfa4097ba0879628a8b2c8772ef6fd0d4a199a55a746 Feb 23 00:10:20 crc kubenswrapper[4735]: I0223 00:10:20.370110 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d9b8d9bd5-xqdx5"] Feb 23 00:10:20 crc kubenswrapper[4735]: W0223 00:10:20.386164 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod783cdbda_0825_4dc7_9a0b_d96f81ae9c01.slice/crio-588fcccdcba224a44160730adfe972836104bf38f8938f7f90b4691e4b1f6451 WatchSource:0}: Error finding container 588fcccdcba224a44160730adfe972836104bf38f8938f7f90b4691e4b1f6451: Status 404 returned error can't find the container with id 588fcccdcba224a44160730adfe972836104bf38f8938f7f90b4691e4b1f6451 Feb 23 00:10:20 crc kubenswrapper[4735]: I0223 00:10:20.612423 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78f64778cd-svhvh"] Feb 23 00:10:20 crc kubenswrapper[4735]: W0223 00:10:20.624078 4735 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63bc3dc7_0311_4933_ab26_8420f17c823b.slice/crio-5f47dd405ff3645b8548a7cc3b2a9ea9d91f9a020f1e9a4b0c2bc46ddb759627 WatchSource:0}: Error finding container 5f47dd405ff3645b8548a7cc3b2a9ea9d91f9a020f1e9a4b0c2bc46ddb759627: Status 404 returned error can't find the container with id 5f47dd405ff3645b8548a7cc3b2a9ea9d91f9a020f1e9a4b0c2bc46ddb759627 Feb 23 00:10:21 crc kubenswrapper[4735]: I0223 00:10:21.013241 4735 generic.go:334] "Generic (PLEG): container finished" podID="c4ec8027-3683-44b2-a91a-a58f49dedbfd" containerID="847f4ae6264b59f732798f4afd3c15bf21d28f56479df92fd29ec995c8173df9" exitCode=0 Feb 23 00:10:21 crc kubenswrapper[4735]: I0223 00:10:21.013415 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bzbm8" event={"ID":"c4ec8027-3683-44b2-a91a-a58f49dedbfd","Type":"ContainerDied","Data":"847f4ae6264b59f732798f4afd3c15bf21d28f56479df92fd29ec995c8173df9"} Feb 23 00:10:21 crc kubenswrapper[4735]: I0223 00:10:21.021432 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bdqfd" event={"ID":"b542cb9e-35cc-44d9-a850-c41887636c4c","Type":"ContainerStarted","Data":"1acbe00e56874b6f49a099a46f6ff09b2c9ae041a9f1a35be1f0875e60b874e5"} Feb 23 00:10:21 crc kubenswrapper[4735]: I0223 00:10:21.021487 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bdqfd" event={"ID":"b542cb9e-35cc-44d9-a850-c41887636c4c","Type":"ContainerStarted","Data":"70d9562356830a1df284bfa4097ba0879628a8b2c8772ef6fd0d4a199a55a746"} Feb 23 00:10:21 crc kubenswrapper[4735]: I0223 00:10:21.026556 4735 generic.go:334] "Generic (PLEG): container finished" podID="78bcce9f-48e6-49c5-8f3f-75d8e156f6bc" containerID="c1bf000a13b884870070b9ea0bc8f2c7160ef753770485ce7e4289bfc33ef70e" exitCode=0 Feb 23 00:10:21 crc kubenswrapper[4735]: I0223 00:10:21.026612 4735 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-gcgz5" event={"ID":"78bcce9f-48e6-49c5-8f3f-75d8e156f6bc","Type":"ContainerDied","Data":"c1bf000a13b884870070b9ea0bc8f2c7160ef753770485ce7e4289bfc33ef70e"} Feb 23 00:10:21 crc kubenswrapper[4735]: I0223 00:10:21.052698 4735 generic.go:334] "Generic (PLEG): container finished" podID="17e19891-1a63-4c0e-abd9-a161f96cf71e" containerID="faea45e90bd87cce8b67e596b7caa61f18b65de0612d204053276442c1412e0d" exitCode=0 Feb 23 00:10:21 crc kubenswrapper[4735]: I0223 00:10:21.052927 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l4mrf" event={"ID":"17e19891-1a63-4c0e-abd9-a161f96cf71e","Type":"ContainerDied","Data":"faea45e90bd87cce8b67e596b7caa61f18b65de0612d204053276442c1412e0d"} Feb 23 00:10:21 crc kubenswrapper[4735]: I0223 00:10:21.057979 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-78f64778cd-svhvh" event={"ID":"63bc3dc7-0311-4933-ab26-8420f17c823b","Type":"ContainerStarted","Data":"5f47dd405ff3645b8548a7cc3b2a9ea9d91f9a020f1e9a4b0c2bc46ddb759627"} Feb 23 00:10:21 crc kubenswrapper[4735]: I0223 00:10:21.066892 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d9b8d9bd5-xqdx5" event={"ID":"783cdbda-0825-4dc7-9a0b-d96f81ae9c01","Type":"ContainerStarted","Data":"c13d47f55f80230908391f440f2da0a35fab71d92bb04eccdaf06c63b4b74ddc"} Feb 23 00:10:21 crc kubenswrapper[4735]: I0223 00:10:21.066940 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d9b8d9bd5-xqdx5" event={"ID":"783cdbda-0825-4dc7-9a0b-d96f81ae9c01","Type":"ContainerStarted","Data":"588fcccdcba224a44160730adfe972836104bf38f8938f7f90b4691e4b1f6451"} Feb 23 00:10:21 crc kubenswrapper[4735]: I0223 00:10:21.067819 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-controller-manager/controller-manager-5d9b8d9bd5-xqdx5" Feb 23 00:10:21 crc kubenswrapper[4735]: I0223 00:10:21.075336 4735 generic.go:334] "Generic (PLEG): container finished" podID="2bac7145-e696-4926-8d9a-de30ef0c6209" containerID="0021d78dd99885ce26ea1bdf7b7df2e60ab9fb2d365126de48066237a4d887de" exitCode=0 Feb 23 00:10:21 crc kubenswrapper[4735]: I0223 00:10:21.075497 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4gt4h" event={"ID":"2bac7145-e696-4926-8d9a-de30ef0c6209","Type":"ContainerDied","Data":"0021d78dd99885ce26ea1bdf7b7df2e60ab9fb2d365126de48066237a4d887de"} Feb 23 00:10:21 crc kubenswrapper[4735]: I0223 00:10:21.085890 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5d9b8d9bd5-xqdx5" Feb 23 00:10:21 crc kubenswrapper[4735]: I0223 00:10:21.096526 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jvnk7"] Feb 23 00:10:21 crc kubenswrapper[4735]: I0223 00:10:21.096714 4735 generic.go:334] "Generic (PLEG): container finished" podID="f444dee5-d7dc-47ed-add6-b3b1148077f2" containerID="4f58202dfbd0973b3c8d29e475d56037577288f1deaa28c554a85c951a7a11c3" exitCode=0 Feb 23 00:10:21 crc kubenswrapper[4735]: I0223 00:10:21.096825 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72lxc" event={"ID":"f444dee5-d7dc-47ed-add6-b3b1148077f2","Type":"ContainerDied","Data":"4f58202dfbd0973b3c8d29e475d56037577288f1deaa28c554a85c951a7a11c3"} Feb 23 00:10:21 crc kubenswrapper[4735]: E0223 00:10:21.105016 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-5xl8z" 
podUID="6b4dd4f4-5ead-4f6f-a993-34effb6df863" Feb 23 00:10:21 crc kubenswrapper[4735]: I0223 00:10:21.157087 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5d9b8d9bd5-xqdx5" podStartSLOduration=15.157056428 podStartE2EDuration="15.157056428s" podCreationTimestamp="2026-02-23 00:10:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:10:21.157025347 +0000 UTC m=+179.620571318" watchObservedRunningTime="2026-02-23 00:10:21.157056428 +0000 UTC m=+179.620602399" Feb 23 00:10:22 crc kubenswrapper[4735]: I0223 00:10:22.111280 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-78f64778cd-svhvh" event={"ID":"63bc3dc7-0311-4933-ab26-8420f17c823b","Type":"ContainerStarted","Data":"e2216f5ad146a11374d2cb40b0127f8a63c40977de49cecce111c884957d6dbf"} Feb 23 00:10:22 crc kubenswrapper[4735]: I0223 00:10:22.111611 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-78f64778cd-svhvh" Feb 23 00:10:22 crc kubenswrapper[4735]: I0223 00:10:22.118067 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4gt4h" event={"ID":"2bac7145-e696-4926-8d9a-de30ef0c6209","Type":"ContainerStarted","Data":"fe19d593922e6b68a3f0792c534718d5e727076776f91118b70de938dc4f2c14"} Feb 23 00:10:22 crc kubenswrapper[4735]: I0223 00:10:22.118158 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-78f64778cd-svhvh" Feb 23 00:10:22 crc kubenswrapper[4735]: I0223 00:10:22.121920 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bzbm8" 
event={"ID":"c4ec8027-3683-44b2-a91a-a58f49dedbfd","Type":"ContainerStarted","Data":"33c9c97d583242c57ea43e2bcdf0b0184b1877b5b7da03ad55f8b386f53a465c"} Feb 23 00:10:22 crc kubenswrapper[4735]: I0223 00:10:22.124334 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bdqfd" event={"ID":"b542cb9e-35cc-44d9-a850-c41887636c4c","Type":"ContainerStarted","Data":"98bca8e4c7bf96ff251032d231941f5b3678b2dd4f52032dbcf5540bfdeac359"} Feb 23 00:10:22 crc kubenswrapper[4735]: I0223 00:10:22.128430 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-78f64778cd-svhvh" podStartSLOduration=16.128415529 podStartE2EDuration="16.128415529s" podCreationTimestamp="2026-02-23 00:10:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:10:22.125868808 +0000 UTC m=+180.589414789" watchObservedRunningTime="2026-02-23 00:10:22.128415529 +0000 UTC m=+180.591961500" Feb 23 00:10:22 crc kubenswrapper[4735]: I0223 00:10:22.186146 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-bdqfd" podStartSLOduration=154.186127868 podStartE2EDuration="2m34.186127868s" podCreationTimestamp="2026-02-23 00:07:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:10:22.185464223 +0000 UTC m=+180.649010194" watchObservedRunningTime="2026-02-23 00:10:22.186127868 +0000 UTC m=+180.649673839" Feb 23 00:10:22 crc kubenswrapper[4735]: I0223 00:10:22.187578 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bzbm8" podStartSLOduration=3.9869550670000002 podStartE2EDuration="33.187570583s" podCreationTimestamp="2026-02-23 00:09:49 +0000 UTC" 
firstStartedPulling="2026-02-23 00:09:52.684363494 +0000 UTC m=+151.147909465" lastFinishedPulling="2026-02-23 00:10:21.88497899 +0000 UTC m=+180.348524981" observedRunningTime="2026-02-23 00:10:22.168711 +0000 UTC m=+180.632256971" watchObservedRunningTime="2026-02-23 00:10:22.187570583 +0000 UTC m=+180.651116554" Feb 23 00:10:22 crc kubenswrapper[4735]: I0223 00:10:22.238444 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4gt4h" podStartSLOduration=1.9248572529999999 podStartE2EDuration="31.238423598s" podCreationTimestamp="2026-02-23 00:09:51 +0000 UTC" firstStartedPulling="2026-02-23 00:09:52.629525784 +0000 UTC m=+151.093071755" lastFinishedPulling="2026-02-23 00:10:21.943092119 +0000 UTC m=+180.406638100" observedRunningTime="2026-02-23 00:10:22.233757045 +0000 UTC m=+180.697303016" watchObservedRunningTime="2026-02-23 00:10:22.238423598 +0000 UTC m=+180.701969569" Feb 23 00:10:23 crc kubenswrapper[4735]: I0223 00:10:23.132596 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gcgz5" event={"ID":"78bcce9f-48e6-49c5-8f3f-75d8e156f6bc","Type":"ContainerStarted","Data":"7ade191e2ff5175f673b2498da4760c0106140645bea01b1e3ae45351522f3ea"} Feb 23 00:10:23 crc kubenswrapper[4735]: I0223 00:10:23.135428 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l4mrf" event={"ID":"17e19891-1a63-4c0e-abd9-a161f96cf71e","Type":"ContainerStarted","Data":"aa2b5a46ca879fd3df78de833d19749769a40af921bd593f4596f7bec370eaee"} Feb 23 00:10:23 crc kubenswrapper[4735]: I0223 00:10:23.137666 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72lxc" event={"ID":"f444dee5-d7dc-47ed-add6-b3b1148077f2","Type":"ContainerStarted","Data":"36d9ef46f2e7af08966bb2c5cb5cc959ce4264e4905d3d18d1086cd80d1525c5"} Feb 23 00:10:23 crc kubenswrapper[4735]: I0223 00:10:23.159312 4735 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gcgz5" podStartSLOduration=3.733842265 podStartE2EDuration="34.159292863s" podCreationTimestamp="2026-02-23 00:09:49 +0000 UTC" firstStartedPulling="2026-02-23 00:09:51.565498943 +0000 UTC m=+150.029044914" lastFinishedPulling="2026-02-23 00:10:21.990949551 +0000 UTC m=+180.454495512" observedRunningTime="2026-02-23 00:10:23.154270533 +0000 UTC m=+181.617816494" watchObservedRunningTime="2026-02-23 00:10:23.159292863 +0000 UTC m=+181.622838834" Feb 23 00:10:23 crc kubenswrapper[4735]: I0223 00:10:23.174983 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l4mrf" podStartSLOduration=3.747936014 podStartE2EDuration="34.174964251s" podCreationTimestamp="2026-02-23 00:09:49 +0000 UTC" firstStartedPulling="2026-02-23 00:09:51.580277398 +0000 UTC m=+150.043823369" lastFinishedPulling="2026-02-23 00:10:22.007305635 +0000 UTC m=+180.470851606" observedRunningTime="2026-02-23 00:10:23.174082479 +0000 UTC m=+181.637628450" watchObservedRunningTime="2026-02-23 00:10:23.174964251 +0000 UTC m=+181.638510222" Feb 23 00:10:23 crc kubenswrapper[4735]: I0223 00:10:23.197742 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-72lxc" podStartSLOduration=3.609482432 podStartE2EDuration="34.197724809s" podCreationTimestamp="2026-02-23 00:09:49 +0000 UTC" firstStartedPulling="2026-02-23 00:09:51.580265568 +0000 UTC m=+150.043811539" lastFinishedPulling="2026-02-23 00:10:22.168507945 +0000 UTC m=+180.632053916" observedRunningTime="2026-02-23 00:10:23.192592115 +0000 UTC m=+181.656138086" watchObservedRunningTime="2026-02-23 00:10:23.197724809 +0000 UTC m=+181.661270780" Feb 23 00:10:23 crc kubenswrapper[4735]: I0223 00:10:23.452489 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-knx8j" Feb 23 00:10:26 crc kubenswrapper[4735]: I0223 00:10:26.149736 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5d9b8d9bd5-xqdx5"] Feb 23 00:10:26 crc kubenswrapper[4735]: I0223 00:10:26.150041 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5d9b8d9bd5-xqdx5" podUID="783cdbda-0825-4dc7-9a0b-d96f81ae9c01" containerName="controller-manager" containerID="cri-o://c13d47f55f80230908391f440f2da0a35fab71d92bb04eccdaf06c63b4b74ddc" gracePeriod=30 Feb 23 00:10:26 crc kubenswrapper[4735]: I0223 00:10:26.243257 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78f64778cd-svhvh"] Feb 23 00:10:26 crc kubenswrapper[4735]: I0223 00:10:26.243430 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-78f64778cd-svhvh" podUID="63bc3dc7-0311-4933-ab26-8420f17c823b" containerName="route-controller-manager" containerID="cri-o://e2216f5ad146a11374d2cb40b0127f8a63c40977de49cecce111c884957d6dbf" gracePeriod=30 Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.166895 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-78f64778cd-svhvh" event={"ID":"63bc3dc7-0311-4933-ab26-8420f17c823b","Type":"ContainerDied","Data":"e2216f5ad146a11374d2cb40b0127f8a63c40977de49cecce111c884957d6dbf"} Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.166993 4735 generic.go:334] "Generic (PLEG): container finished" podID="63bc3dc7-0311-4933-ab26-8420f17c823b" containerID="e2216f5ad146a11374d2cb40b0127f8a63c40977de49cecce111c884957d6dbf" exitCode=0 Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.169391 4735 generic.go:334] "Generic (PLEG): container 
finished" podID="783cdbda-0825-4dc7-9a0b-d96f81ae9c01" containerID="c13d47f55f80230908391f440f2da0a35fab71d92bb04eccdaf06c63b4b74ddc" exitCode=0 Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.169439 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d9b8d9bd5-xqdx5" event={"ID":"783cdbda-0825-4dc7-9a0b-d96f81ae9c01","Type":"ContainerDied","Data":"c13d47f55f80230908391f440f2da0a35fab71d92bb04eccdaf06c63b4b74ddc"} Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.566983 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-78f64778cd-svhvh" Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.574433 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d9b8d9bd5-xqdx5" Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.609145 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-597ccdd7cc-vbgfb"] Feb 23 00:10:27 crc kubenswrapper[4735]: E0223 00:10:27.609611 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cb942f6-2bc5-4662-929c-849ec29baddc" containerName="image-pruner" Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.609629 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cb942f6-2bc5-4662-929c-849ec29baddc" containerName="image-pruner" Feb 23 00:10:27 crc kubenswrapper[4735]: E0223 00:10:27.609651 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63bc3dc7-0311-4933-ab26-8420f17c823b" containerName="route-controller-manager" Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.609658 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="63bc3dc7-0311-4933-ab26-8420f17c823b" containerName="route-controller-manager" Feb 23 00:10:27 crc kubenswrapper[4735]: E0223 00:10:27.609669 4735 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="783cdbda-0825-4dc7-9a0b-d96f81ae9c01" containerName="controller-manager" Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.609678 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="783cdbda-0825-4dc7-9a0b-d96f81ae9c01" containerName="controller-manager" Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.609778 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="63bc3dc7-0311-4933-ab26-8420f17c823b" containerName="route-controller-manager" Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.609789 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cb942f6-2bc5-4662-929c-849ec29baddc" containerName="image-pruner" Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.609801 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="783cdbda-0825-4dc7-9a0b-d96f81ae9c01" containerName="controller-manager" Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.610417 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-597ccdd7cc-vbgfb" Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.619989 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-597ccdd7cc-vbgfb"] Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.638898 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63bc3dc7-0311-4933-ab26-8420f17c823b-serving-cert\") pod \"63bc3dc7-0311-4933-ab26-8420f17c823b\" (UID: \"63bc3dc7-0311-4933-ab26-8420f17c823b\") " Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.639043 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/783cdbda-0825-4dc7-9a0b-d96f81ae9c01-config\") pod \"783cdbda-0825-4dc7-9a0b-d96f81ae9c01\" (UID: \"783cdbda-0825-4dc7-9a0b-d96f81ae9c01\") " Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.639118 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r27nm\" (UniqueName: \"kubernetes.io/projected/63bc3dc7-0311-4933-ab26-8420f17c823b-kube-api-access-r27nm\") pod \"63bc3dc7-0311-4933-ab26-8420f17c823b\" (UID: \"63bc3dc7-0311-4933-ab26-8420f17c823b\") " Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.639202 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63bc3dc7-0311-4933-ab26-8420f17c823b-config\") pod \"63bc3dc7-0311-4933-ab26-8420f17c823b\" (UID: \"63bc3dc7-0311-4933-ab26-8420f17c823b\") " Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.639270 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgfqw\" (UniqueName: \"kubernetes.io/projected/783cdbda-0825-4dc7-9a0b-d96f81ae9c01-kube-api-access-qgfqw\") pod 
\"783cdbda-0825-4dc7-9a0b-d96f81ae9c01\" (UID: \"783cdbda-0825-4dc7-9a0b-d96f81ae9c01\") " Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.639341 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/783cdbda-0825-4dc7-9a0b-d96f81ae9c01-client-ca\") pod \"783cdbda-0825-4dc7-9a0b-d96f81ae9c01\" (UID: \"783cdbda-0825-4dc7-9a0b-d96f81ae9c01\") " Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.639369 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63bc3dc7-0311-4933-ab26-8420f17c823b-client-ca\") pod \"63bc3dc7-0311-4933-ab26-8420f17c823b\" (UID: \"63bc3dc7-0311-4933-ab26-8420f17c823b\") " Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.639512 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/783cdbda-0825-4dc7-9a0b-d96f81ae9c01-proxy-ca-bundles\") pod \"783cdbda-0825-4dc7-9a0b-d96f81ae9c01\" (UID: \"783cdbda-0825-4dc7-9a0b-d96f81ae9c01\") " Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.639553 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/783cdbda-0825-4dc7-9a0b-d96f81ae9c01-serving-cert\") pod \"783cdbda-0825-4dc7-9a0b-d96f81ae9c01\" (UID: \"783cdbda-0825-4dc7-9a0b-d96f81ae9c01\") " Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.640482 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/783cdbda-0825-4dc7-9a0b-d96f81ae9c01-client-ca" (OuterVolumeSpecName: "client-ca") pod "783cdbda-0825-4dc7-9a0b-d96f81ae9c01" (UID: "783cdbda-0825-4dc7-9a0b-d96f81ae9c01"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.640668 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63bc3dc7-0311-4933-ab26-8420f17c823b-config" (OuterVolumeSpecName: "config") pod "63bc3dc7-0311-4933-ab26-8420f17c823b" (UID: "63bc3dc7-0311-4933-ab26-8420f17c823b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.641763 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63bc3dc7-0311-4933-ab26-8420f17c823b-client-ca" (OuterVolumeSpecName: "client-ca") pod "63bc3dc7-0311-4933-ab26-8420f17c823b" (UID: "63bc3dc7-0311-4933-ab26-8420f17c823b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.652471 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/783cdbda-0825-4dc7-9a0b-d96f81ae9c01-kube-api-access-qgfqw" (OuterVolumeSpecName: "kube-api-access-qgfqw") pod "783cdbda-0825-4dc7-9a0b-d96f81ae9c01" (UID: "783cdbda-0825-4dc7-9a0b-d96f81ae9c01"). InnerVolumeSpecName "kube-api-access-qgfqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.652620 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63bc3dc7-0311-4933-ab26-8420f17c823b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "63bc3dc7-0311-4933-ab26-8420f17c823b" (UID: "63bc3dc7-0311-4933-ab26-8420f17c823b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.653226 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/783cdbda-0825-4dc7-9a0b-d96f81ae9c01-config" (OuterVolumeSpecName: "config") pod "783cdbda-0825-4dc7-9a0b-d96f81ae9c01" (UID: "783cdbda-0825-4dc7-9a0b-d96f81ae9c01"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.653354 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63bc3dc7-0311-4933-ab26-8420f17c823b-kube-api-access-r27nm" (OuterVolumeSpecName: "kube-api-access-r27nm") pod "63bc3dc7-0311-4933-ab26-8420f17c823b" (UID: "63bc3dc7-0311-4933-ab26-8420f17c823b"). InnerVolumeSpecName "kube-api-access-r27nm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.653481 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/783cdbda-0825-4dc7-9a0b-d96f81ae9c01-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "783cdbda-0825-4dc7-9a0b-d96f81ae9c01" (UID: "783cdbda-0825-4dc7-9a0b-d96f81ae9c01"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.665762 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/783cdbda-0825-4dc7-9a0b-d96f81ae9c01-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "783cdbda-0825-4dc7-9a0b-d96f81ae9c01" (UID: "783cdbda-0825-4dc7-9a0b-d96f81ae9c01"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.741611 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78e0db83-6432-4c4c-bff1-e5134760124e-client-ca\") pod \"route-controller-manager-597ccdd7cc-vbgfb\" (UID: \"78e0db83-6432-4c4c-bff1-e5134760124e\") " pod="openshift-route-controller-manager/route-controller-manager-597ccdd7cc-vbgfb" Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.741782 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78e0db83-6432-4c4c-bff1-e5134760124e-config\") pod \"route-controller-manager-597ccdd7cc-vbgfb\" (UID: \"78e0db83-6432-4c4c-bff1-e5134760124e\") " pod="openshift-route-controller-manager/route-controller-manager-597ccdd7cc-vbgfb" Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.741837 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78e0db83-6432-4c4c-bff1-e5134760124e-serving-cert\") pod \"route-controller-manager-597ccdd7cc-vbgfb\" (UID: \"78e0db83-6432-4c4c-bff1-e5134760124e\") " pod="openshift-route-controller-manager/route-controller-manager-597ccdd7cc-vbgfb" Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.741886 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7snw\" (UniqueName: \"kubernetes.io/projected/78e0db83-6432-4c4c-bff1-e5134760124e-kube-api-access-d7snw\") pod \"route-controller-manager-597ccdd7cc-vbgfb\" (UID: \"78e0db83-6432-4c4c-bff1-e5134760124e\") " pod="openshift-route-controller-manager/route-controller-manager-597ccdd7cc-vbgfb" Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.742100 4735 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/63bc3dc7-0311-4933-ab26-8420f17c823b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.742147 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/783cdbda-0825-4dc7-9a0b-d96f81ae9c01-config\") on node \"crc\" DevicePath \"\"" Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.742165 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r27nm\" (UniqueName: \"kubernetes.io/projected/63bc3dc7-0311-4933-ab26-8420f17c823b-kube-api-access-r27nm\") on node \"crc\" DevicePath \"\"" Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.742182 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63bc3dc7-0311-4933-ab26-8420f17c823b-config\") on node \"crc\" DevicePath \"\"" Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.742194 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgfqw\" (UniqueName: \"kubernetes.io/projected/783cdbda-0825-4dc7-9a0b-d96f81ae9c01-kube-api-access-qgfqw\") on node \"crc\" DevicePath \"\"" Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.742206 4735 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/783cdbda-0825-4dc7-9a0b-d96f81ae9c01-client-ca\") on node \"crc\" DevicePath \"\"" Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.742218 4735 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/63bc3dc7-0311-4933-ab26-8420f17c823b-client-ca\") on node \"crc\" DevicePath \"\"" Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.742229 4735 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/783cdbda-0825-4dc7-9a0b-d96f81ae9c01-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 
23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.742241 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/783cdbda-0825-4dc7-9a0b-d96f81ae9c01-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.843673 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78e0db83-6432-4c4c-bff1-e5134760124e-client-ca\") pod \"route-controller-manager-597ccdd7cc-vbgfb\" (UID: \"78e0db83-6432-4c4c-bff1-e5134760124e\") " pod="openshift-route-controller-manager/route-controller-manager-597ccdd7cc-vbgfb" Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.844021 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78e0db83-6432-4c4c-bff1-e5134760124e-config\") pod \"route-controller-manager-597ccdd7cc-vbgfb\" (UID: \"78e0db83-6432-4c4c-bff1-e5134760124e\") " pod="openshift-route-controller-manager/route-controller-manager-597ccdd7cc-vbgfb" Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.844064 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78e0db83-6432-4c4c-bff1-e5134760124e-serving-cert\") pod \"route-controller-manager-597ccdd7cc-vbgfb\" (UID: \"78e0db83-6432-4c4c-bff1-e5134760124e\") " pod="openshift-route-controller-manager/route-controller-manager-597ccdd7cc-vbgfb" Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.844100 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7snw\" (UniqueName: \"kubernetes.io/projected/78e0db83-6432-4c4c-bff1-e5134760124e-kube-api-access-d7snw\") pod \"route-controller-manager-597ccdd7cc-vbgfb\" (UID: \"78e0db83-6432-4c4c-bff1-e5134760124e\") " pod="openshift-route-controller-manager/route-controller-manager-597ccdd7cc-vbgfb" 
Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.845443 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78e0db83-6432-4c4c-bff1-e5134760124e-client-ca\") pod \"route-controller-manager-597ccdd7cc-vbgfb\" (UID: \"78e0db83-6432-4c4c-bff1-e5134760124e\") " pod="openshift-route-controller-manager/route-controller-manager-597ccdd7cc-vbgfb" Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.846502 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78e0db83-6432-4c4c-bff1-e5134760124e-config\") pod \"route-controller-manager-597ccdd7cc-vbgfb\" (UID: \"78e0db83-6432-4c4c-bff1-e5134760124e\") " pod="openshift-route-controller-manager/route-controller-manager-597ccdd7cc-vbgfb" Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.851317 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78e0db83-6432-4c4c-bff1-e5134760124e-serving-cert\") pod \"route-controller-manager-597ccdd7cc-vbgfb\" (UID: \"78e0db83-6432-4c4c-bff1-e5134760124e\") " pod="openshift-route-controller-manager/route-controller-manager-597ccdd7cc-vbgfb" Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.873963 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7snw\" (UniqueName: \"kubernetes.io/projected/78e0db83-6432-4c4c-bff1-e5134760124e-kube-api-access-d7snw\") pod \"route-controller-manager-597ccdd7cc-vbgfb\" (UID: \"78e0db83-6432-4c4c-bff1-e5134760124e\") " pod="openshift-route-controller-manager/route-controller-manager-597ccdd7cc-vbgfb" Feb 23 00:10:27 crc kubenswrapper[4735]: I0223 00:10:27.937374 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-597ccdd7cc-vbgfb" Feb 23 00:10:28 crc kubenswrapper[4735]: I0223 00:10:28.177993 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d9b8d9bd5-xqdx5" event={"ID":"783cdbda-0825-4dc7-9a0b-d96f81ae9c01","Type":"ContainerDied","Data":"588fcccdcba224a44160730adfe972836104bf38f8938f7f90b4691e4b1f6451"} Feb 23 00:10:28 crc kubenswrapper[4735]: I0223 00:10:28.178010 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d9b8d9bd5-xqdx5" Feb 23 00:10:28 crc kubenswrapper[4735]: I0223 00:10:28.178107 4735 scope.go:117] "RemoveContainer" containerID="c13d47f55f80230908391f440f2da0a35fab71d92bb04eccdaf06c63b4b74ddc" Feb 23 00:10:28 crc kubenswrapper[4735]: I0223 00:10:28.183570 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-78f64778cd-svhvh" event={"ID":"63bc3dc7-0311-4933-ab26-8420f17c823b","Type":"ContainerDied","Data":"5f47dd405ff3645b8548a7cc3b2a9ea9d91f9a020f1e9a4b0c2bc46ddb759627"} Feb 23 00:10:28 crc kubenswrapper[4735]: I0223 00:10:28.183623 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-78f64778cd-svhvh" Feb 23 00:10:28 crc kubenswrapper[4735]: I0223 00:10:28.208831 4735 scope.go:117] "RemoveContainer" containerID="e2216f5ad146a11374d2cb40b0127f8a63c40977de49cecce111c884957d6dbf" Feb 23 00:10:28 crc kubenswrapper[4735]: I0223 00:10:28.209146 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5d9b8d9bd5-xqdx5"] Feb 23 00:10:28 crc kubenswrapper[4735]: I0223 00:10:28.212578 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5d9b8d9bd5-xqdx5"] Feb 23 00:10:28 crc kubenswrapper[4735]: I0223 00:10:28.224003 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78f64778cd-svhvh"] Feb 23 00:10:28 crc kubenswrapper[4735]: I0223 00:10:28.227304 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78f64778cd-svhvh"] Feb 23 00:10:28 crc kubenswrapper[4735]: I0223 00:10:28.280659 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63bc3dc7-0311-4933-ab26-8420f17c823b" path="/var/lib/kubelet/pods/63bc3dc7-0311-4933-ab26-8420f17c823b/volumes" Feb 23 00:10:28 crc kubenswrapper[4735]: I0223 00:10:28.281434 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="783cdbda-0825-4dc7-9a0b-d96f81ae9c01" path="/var/lib/kubelet/pods/783cdbda-0825-4dc7-9a0b-d96f81ae9c01/volumes" Feb 23 00:10:28 crc kubenswrapper[4735]: I0223 00:10:28.329527 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-597ccdd7cc-vbgfb"] Feb 23 00:10:28 crc kubenswrapper[4735]: W0223 00:10:28.333820 4735 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78e0db83_6432_4c4c_bff1_e5134760124e.slice/crio-9e97097d3b2f1f8c5152a8e4344ef9c74a7d230473dd06c9e7b1b103e0482b74 WatchSource:0}: Error finding container 9e97097d3b2f1f8c5152a8e4344ef9c74a7d230473dd06c9e7b1b103e0482b74: Status 404 returned error can't find the container with id 9e97097d3b2f1f8c5152a8e4344ef9c74a7d230473dd06c9e7b1b103e0482b74 Feb 23 00:10:29 crc kubenswrapper[4735]: I0223 00:10:29.192546 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-597ccdd7cc-vbgfb" event={"ID":"78e0db83-6432-4c4c-bff1-e5134760124e","Type":"ContainerStarted","Data":"46d1076fb3c645a22bde2440950984f2d329f8b483459c85ff00efacec7c619e"} Feb 23 00:10:29 crc kubenswrapper[4735]: I0223 00:10:29.192600 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-597ccdd7cc-vbgfb" event={"ID":"78e0db83-6432-4c4c-bff1-e5134760124e","Type":"ContainerStarted","Data":"9e97097d3b2f1f8c5152a8e4344ef9c74a7d230473dd06c9e7b1b103e0482b74"} Feb 23 00:10:29 crc kubenswrapper[4735]: I0223 00:10:29.192798 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-597ccdd7cc-vbgfb" Feb 23 00:10:29 crc kubenswrapper[4735]: I0223 00:10:29.198534 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-597ccdd7cc-vbgfb" Feb 23 00:10:29 crc kubenswrapper[4735]: I0223 00:10:29.214722 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-597ccdd7cc-vbgfb" podStartSLOduration=3.21470074 podStartE2EDuration="3.21470074s" podCreationTimestamp="2026-02-23 00:10:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-23 00:10:29.213259076 +0000 UTC m=+187.676805077" watchObservedRunningTime="2026-02-23 00:10:29.21470074 +0000 UTC m=+187.678246721" Feb 23 00:10:29 crc kubenswrapper[4735]: I0223 00:10:29.429556 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l4mrf" Feb 23 00:10:29 crc kubenswrapper[4735]: I0223 00:10:29.429607 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l4mrf" Feb 23 00:10:29 crc kubenswrapper[4735]: I0223 00:10:29.666515 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-72lxc" Feb 23 00:10:29 crc kubenswrapper[4735]: I0223 00:10:29.666577 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-72lxc" Feb 23 00:10:29 crc kubenswrapper[4735]: I0223 00:10:29.827560 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-cb6b46446-77nq4"] Feb 23 00:10:29 crc kubenswrapper[4735]: I0223 00:10:29.828334 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-cb6b46446-77nq4" Feb 23 00:10:29 crc kubenswrapper[4735]: I0223 00:10:29.830370 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 23 00:10:29 crc kubenswrapper[4735]: I0223 00:10:29.830755 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 23 00:10:29 crc kubenswrapper[4735]: I0223 00:10:29.830784 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 23 00:10:29 crc kubenswrapper[4735]: I0223 00:10:29.830951 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 23 00:10:29 crc kubenswrapper[4735]: I0223 00:10:29.831037 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 23 00:10:29 crc kubenswrapper[4735]: I0223 00:10:29.831051 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 23 00:10:29 crc kubenswrapper[4735]: I0223 00:10:29.840067 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 23 00:10:29 crc kubenswrapper[4735]: I0223 00:10:29.842165 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-cb6b46446-77nq4"] Feb 23 00:10:29 crc kubenswrapper[4735]: I0223 00:10:29.869218 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bzbm8" Feb 23 00:10:29 crc kubenswrapper[4735]: I0223 00:10:29.869267 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bzbm8" Feb 23 00:10:29 crc kubenswrapper[4735]: 
I0223 00:10:29.870276 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/756348da-76db-4458-a280-488a6c79756a-client-ca\") pod \"controller-manager-cb6b46446-77nq4\" (UID: \"756348da-76db-4458-a280-488a6c79756a\") " pod="openshift-controller-manager/controller-manager-cb6b46446-77nq4" Feb 23 00:10:29 crc kubenswrapper[4735]: I0223 00:10:29.870312 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/756348da-76db-4458-a280-488a6c79756a-config\") pod \"controller-manager-cb6b46446-77nq4\" (UID: \"756348da-76db-4458-a280-488a6c79756a\") " pod="openshift-controller-manager/controller-manager-cb6b46446-77nq4" Feb 23 00:10:29 crc kubenswrapper[4735]: I0223 00:10:29.870375 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/756348da-76db-4458-a280-488a6c79756a-serving-cert\") pod \"controller-manager-cb6b46446-77nq4\" (UID: \"756348da-76db-4458-a280-488a6c79756a\") " pod="openshift-controller-manager/controller-manager-cb6b46446-77nq4" Feb 23 00:10:29 crc kubenswrapper[4735]: I0223 00:10:29.870450 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/756348da-76db-4458-a280-488a6c79756a-proxy-ca-bundles\") pod \"controller-manager-cb6b46446-77nq4\" (UID: \"756348da-76db-4458-a280-488a6c79756a\") " pod="openshift-controller-manager/controller-manager-cb6b46446-77nq4" Feb 23 00:10:29 crc kubenswrapper[4735]: I0223 00:10:29.870662 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8shrv\" (UniqueName: \"kubernetes.io/projected/756348da-76db-4458-a280-488a6c79756a-kube-api-access-8shrv\") pod 
\"controller-manager-cb6b46446-77nq4\" (UID: \"756348da-76db-4458-a280-488a6c79756a\") " pod="openshift-controller-manager/controller-manager-cb6b46446-77nq4" Feb 23 00:10:29 crc kubenswrapper[4735]: I0223 00:10:29.944252 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bzbm8" Feb 23 00:10:29 crc kubenswrapper[4735]: I0223 00:10:29.944333 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l4mrf" Feb 23 00:10:29 crc kubenswrapper[4735]: I0223 00:10:29.947093 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-72lxc" Feb 23 00:10:29 crc kubenswrapper[4735]: I0223 00:10:29.972023 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8shrv\" (UniqueName: \"kubernetes.io/projected/756348da-76db-4458-a280-488a6c79756a-kube-api-access-8shrv\") pod \"controller-manager-cb6b46446-77nq4\" (UID: \"756348da-76db-4458-a280-488a6c79756a\") " pod="openshift-controller-manager/controller-manager-cb6b46446-77nq4" Feb 23 00:10:29 crc kubenswrapper[4735]: I0223 00:10:29.972108 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/756348da-76db-4458-a280-488a6c79756a-client-ca\") pod \"controller-manager-cb6b46446-77nq4\" (UID: \"756348da-76db-4458-a280-488a6c79756a\") " pod="openshift-controller-manager/controller-manager-cb6b46446-77nq4" Feb 23 00:10:29 crc kubenswrapper[4735]: I0223 00:10:29.972186 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/756348da-76db-4458-a280-488a6c79756a-config\") pod \"controller-manager-cb6b46446-77nq4\" (UID: \"756348da-76db-4458-a280-488a6c79756a\") " pod="openshift-controller-manager/controller-manager-cb6b46446-77nq4" Feb 23 00:10:29 crc 
kubenswrapper[4735]: I0223 00:10:29.972222 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/756348da-76db-4458-a280-488a6c79756a-serving-cert\") pod \"controller-manager-cb6b46446-77nq4\" (UID: \"756348da-76db-4458-a280-488a6c79756a\") " pod="openshift-controller-manager/controller-manager-cb6b46446-77nq4" Feb 23 00:10:29 crc kubenswrapper[4735]: I0223 00:10:29.972290 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/756348da-76db-4458-a280-488a6c79756a-proxy-ca-bundles\") pod \"controller-manager-cb6b46446-77nq4\" (UID: \"756348da-76db-4458-a280-488a6c79756a\") " pod="openshift-controller-manager/controller-manager-cb6b46446-77nq4" Feb 23 00:10:29 crc kubenswrapper[4735]: I0223 00:10:29.972997 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/756348da-76db-4458-a280-488a6c79756a-client-ca\") pod \"controller-manager-cb6b46446-77nq4\" (UID: \"756348da-76db-4458-a280-488a6c79756a\") " pod="openshift-controller-manager/controller-manager-cb6b46446-77nq4" Feb 23 00:10:29 crc kubenswrapper[4735]: I0223 00:10:29.974510 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/756348da-76db-4458-a280-488a6c79756a-proxy-ca-bundles\") pod \"controller-manager-cb6b46446-77nq4\" (UID: \"756348da-76db-4458-a280-488a6c79756a\") " pod="openshift-controller-manager/controller-manager-cb6b46446-77nq4" Feb 23 00:10:29 crc kubenswrapper[4735]: I0223 00:10:29.974841 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/756348da-76db-4458-a280-488a6c79756a-config\") pod \"controller-manager-cb6b46446-77nq4\" (UID: \"756348da-76db-4458-a280-488a6c79756a\") " 
pod="openshift-controller-manager/controller-manager-cb6b46446-77nq4" Feb 23 00:10:29 crc kubenswrapper[4735]: I0223 00:10:29.978240 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/756348da-76db-4458-a280-488a6c79756a-serving-cert\") pod \"controller-manager-cb6b46446-77nq4\" (UID: \"756348da-76db-4458-a280-488a6c79756a\") " pod="openshift-controller-manager/controller-manager-cb6b46446-77nq4" Feb 23 00:10:29 crc kubenswrapper[4735]: I0223 00:10:29.994900 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8shrv\" (UniqueName: \"kubernetes.io/projected/756348da-76db-4458-a280-488a6c79756a-kube-api-access-8shrv\") pod \"controller-manager-cb6b46446-77nq4\" (UID: \"756348da-76db-4458-a280-488a6c79756a\") " pod="openshift-controller-manager/controller-manager-cb6b46446-77nq4" Feb 23 00:10:30 crc kubenswrapper[4735]: I0223 00:10:30.006787 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 23 00:10:30 crc kubenswrapper[4735]: I0223 00:10:30.115011 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gcgz5" Feb 23 00:10:30 crc kubenswrapper[4735]: I0223 00:10:30.115088 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gcgz5" Feb 23 00:10:30 crc kubenswrapper[4735]: I0223 00:10:30.154721 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gcgz5" Feb 23 00:10:30 crc kubenswrapper[4735]: I0223 00:10:30.154864 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-cb6b46446-77nq4" Feb 23 00:10:30 crc kubenswrapper[4735]: I0223 00:10:30.247584 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-l4mrf" Feb 23 00:10:30 crc kubenswrapper[4735]: I0223 00:10:30.247652 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bzbm8" Feb 23 00:10:30 crc kubenswrapper[4735]: I0223 00:10:30.253074 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-72lxc" Feb 23 00:10:30 crc kubenswrapper[4735]: I0223 00:10:30.253162 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gcgz5" Feb 23 00:10:30 crc kubenswrapper[4735]: I0223 00:10:30.595479 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-cb6b46446-77nq4"] Feb 23 00:10:31 crc kubenswrapper[4735]: I0223 00:10:31.207790 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-cb6b46446-77nq4" event={"ID":"756348da-76db-4458-a280-488a6c79756a","Type":"ContainerStarted","Data":"704c75a1c565a4f56b1ecfc0889f91c9e80230550fe1b31c34f734d79de23206"} Feb 23 00:10:31 crc kubenswrapper[4735]: I0223 00:10:31.207841 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-cb6b46446-77nq4" event={"ID":"756348da-76db-4458-a280-488a6c79756a","Type":"ContainerStarted","Data":"ed8af5d1625b7f79bdb98f6c6340a5d96e29e7946ff573099b77ded79b50d304"} Feb 23 00:10:31 crc kubenswrapper[4735]: I0223 00:10:31.228942 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-cb6b46446-77nq4" podStartSLOduration=5.228915104 
podStartE2EDuration="5.228915104s" podCreationTimestamp="2026-02-23 00:10:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:10:31.226952647 +0000 UTC m=+189.690498618" watchObservedRunningTime="2026-02-23 00:10:31.228915104 +0000 UTC m=+189.692461115" Feb 23 00:10:31 crc kubenswrapper[4735]: I0223 00:10:31.594649 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4gt4h" Feb 23 00:10:31 crc kubenswrapper[4735]: I0223 00:10:31.594962 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4gt4h" Feb 23 00:10:31 crc kubenswrapper[4735]: I0223 00:10:31.659672 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4gt4h" Feb 23 00:10:31 crc kubenswrapper[4735]: I0223 00:10:31.980223 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gcgz5"] Feb 23 00:10:32 crc kubenswrapper[4735]: I0223 00:10:32.229224 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gcgz5" podUID="78bcce9f-48e6-49c5-8f3f-75d8e156f6bc" containerName="registry-server" containerID="cri-o://7ade191e2ff5175f673b2498da4760c0106140645bea01b1e3ae45351522f3ea" gracePeriod=2 Feb 23 00:10:32 crc kubenswrapper[4735]: I0223 00:10:32.229949 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mvwmg" event={"ID":"703af3dd-f895-4e96-991a-7e8a405bb03e","Type":"ContainerStarted","Data":"1e42f25e19e712b80a9bf6a46633d1adb8ee6662e617935da0a81cff57384d3f"} Feb 23 00:10:32 crc kubenswrapper[4735]: I0223 00:10:32.230013 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-cb6b46446-77nq4" Feb 23 
00:10:32 crc kubenswrapper[4735]: I0223 00:10:32.238287 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-cb6b46446-77nq4"
Feb 23 00:10:32 crc kubenswrapper[4735]: I0223 00:10:32.270549 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4gt4h"
Feb 23 00:10:32 crc kubenswrapper[4735]: I0223 00:10:32.576399 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bzbm8"]
Feb 23 00:10:32 crc kubenswrapper[4735]: I0223 00:10:32.577002 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bzbm8" podUID="c4ec8027-3683-44b2-a91a-a58f49dedbfd" containerName="registry-server" containerID="cri-o://33c9c97d583242c57ea43e2bcdf0b0184b1877b5b7da03ad55f8b386f53a465c" gracePeriod=2
Feb 23 00:10:32 crc kubenswrapper[4735]: I0223 00:10:32.598023 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gcgz5"
Feb 23 00:10:32 crc kubenswrapper[4735]: I0223 00:10:32.712661 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpqv4\" (UniqueName: \"kubernetes.io/projected/78bcce9f-48e6-49c5-8f3f-75d8e156f6bc-kube-api-access-bpqv4\") pod \"78bcce9f-48e6-49c5-8f3f-75d8e156f6bc\" (UID: \"78bcce9f-48e6-49c5-8f3f-75d8e156f6bc\") "
Feb 23 00:10:32 crc kubenswrapper[4735]: I0223 00:10:32.712780 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78bcce9f-48e6-49c5-8f3f-75d8e156f6bc-utilities\") pod \"78bcce9f-48e6-49c5-8f3f-75d8e156f6bc\" (UID: \"78bcce9f-48e6-49c5-8f3f-75d8e156f6bc\") "
Feb 23 00:10:32 crc kubenswrapper[4735]: I0223 00:10:32.712826 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78bcce9f-48e6-49c5-8f3f-75d8e156f6bc-catalog-content\") pod \"78bcce9f-48e6-49c5-8f3f-75d8e156f6bc\" (UID: \"78bcce9f-48e6-49c5-8f3f-75d8e156f6bc\") "
Feb 23 00:10:32 crc kubenswrapper[4735]: I0223 00:10:32.713679 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78bcce9f-48e6-49c5-8f3f-75d8e156f6bc-utilities" (OuterVolumeSpecName: "utilities") pod "78bcce9f-48e6-49c5-8f3f-75d8e156f6bc" (UID: "78bcce9f-48e6-49c5-8f3f-75d8e156f6bc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 00:10:32 crc kubenswrapper[4735]: I0223 00:10:32.717651 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78bcce9f-48e6-49c5-8f3f-75d8e156f6bc-kube-api-access-bpqv4" (OuterVolumeSpecName: "kube-api-access-bpqv4") pod "78bcce9f-48e6-49c5-8f3f-75d8e156f6bc" (UID: "78bcce9f-48e6-49c5-8f3f-75d8e156f6bc"). InnerVolumeSpecName "kube-api-access-bpqv4".
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:10:32 crc kubenswrapper[4735]: I0223 00:10:32.769586 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78bcce9f-48e6-49c5-8f3f-75d8e156f6bc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "78bcce9f-48e6-49c5-8f3f-75d8e156f6bc" (UID: "78bcce9f-48e6-49c5-8f3f-75d8e156f6bc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 00:10:32 crc kubenswrapper[4735]: I0223 00:10:32.814490 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpqv4\" (UniqueName: \"kubernetes.io/projected/78bcce9f-48e6-49c5-8f3f-75d8e156f6bc-kube-api-access-bpqv4\") on node \"crc\" DevicePath \"\"" Feb 23 00:10:32 crc kubenswrapper[4735]: I0223 00:10:32.814543 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78bcce9f-48e6-49c5-8f3f-75d8e156f6bc-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 00:10:32 crc kubenswrapper[4735]: I0223 00:10:32.814557 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78bcce9f-48e6-49c5-8f3f-75d8e156f6bc-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 00:10:32 crc kubenswrapper[4735]: I0223 00:10:32.941200 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bzbm8" Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.017282 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4ec8027-3683-44b2-a91a-a58f49dedbfd-utilities\") pod \"c4ec8027-3683-44b2-a91a-a58f49dedbfd\" (UID: \"c4ec8027-3683-44b2-a91a-a58f49dedbfd\") " Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.017622 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4ec8027-3683-44b2-a91a-a58f49dedbfd-catalog-content\") pod \"c4ec8027-3683-44b2-a91a-a58f49dedbfd\" (UID: \"c4ec8027-3683-44b2-a91a-a58f49dedbfd\") " Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.017775 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nvrz\" (UniqueName: \"kubernetes.io/projected/c4ec8027-3683-44b2-a91a-a58f49dedbfd-kube-api-access-6nvrz\") pod \"c4ec8027-3683-44b2-a91a-a58f49dedbfd\" (UID: \"c4ec8027-3683-44b2-a91a-a58f49dedbfd\") " Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.024092 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4ec8027-3683-44b2-a91a-a58f49dedbfd-kube-api-access-6nvrz" (OuterVolumeSpecName: "kube-api-access-6nvrz") pod "c4ec8027-3683-44b2-a91a-a58f49dedbfd" (UID: "c4ec8027-3683-44b2-a91a-a58f49dedbfd"). InnerVolumeSpecName "kube-api-access-6nvrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.025673 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4ec8027-3683-44b2-a91a-a58f49dedbfd-utilities" (OuterVolumeSpecName: "utilities") pod "c4ec8027-3683-44b2-a91a-a58f49dedbfd" (UID: "c4ec8027-3683-44b2-a91a-a58f49dedbfd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.098517 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4ec8027-3683-44b2-a91a-a58f49dedbfd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c4ec8027-3683-44b2-a91a-a58f49dedbfd" (UID: "c4ec8027-3683-44b2-a91a-a58f49dedbfd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.120196 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nvrz\" (UniqueName: \"kubernetes.io/projected/c4ec8027-3683-44b2-a91a-a58f49dedbfd-kube-api-access-6nvrz\") on node \"crc\" DevicePath \"\""
Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.120251 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4ec8027-3683-44b2-a91a-a58f49dedbfd-utilities\") on node \"crc\" DevicePath \"\""
Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.120272 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4ec8027-3683-44b2-a91a-a58f49dedbfd-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.240065 4735 generic.go:334] "Generic (PLEG): container finished" podID="703af3dd-f895-4e96-991a-7e8a405bb03e" containerID="1e42f25e19e712b80a9bf6a46633d1adb8ee6662e617935da0a81cff57384d3f" exitCode=0
Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.240179 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mvwmg" event={"ID":"703af3dd-f895-4e96-991a-7e8a405bb03e","Type":"ContainerDied","Data":"1e42f25e19e712b80a9bf6a46633d1adb8ee6662e617935da0a81cff57384d3f"}
Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.243839 4735 generic.go:334] "Generic (PLEG): container finished" podID="78bcce9f-48e6-49c5-8f3f-75d8e156f6bc" containerID="7ade191e2ff5175f673b2498da4760c0106140645bea01b1e3ae45351522f3ea" exitCode=0
Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.243977 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gcgz5" event={"ID":"78bcce9f-48e6-49c5-8f3f-75d8e156f6bc","Type":"ContainerDied","Data":"7ade191e2ff5175f673b2498da4760c0106140645bea01b1e3ae45351522f3ea"}
Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.244021 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gcgz5" event={"ID":"78bcce9f-48e6-49c5-8f3f-75d8e156f6bc","Type":"ContainerDied","Data":"316cfff5800b138d553bf66a05e7880d2e3155c1b78fa3da05e4a41d28d0b2ff"}
Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.244051 4735 scope.go:117] "RemoveContainer" containerID="7ade191e2ff5175f673b2498da4760c0106140645bea01b1e3ae45351522f3ea"
Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.244235 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gcgz5"
Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.262503 4735 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/community-operators-bzbm8" Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.262580 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bzbm8" event={"ID":"c4ec8027-3683-44b2-a91a-a58f49dedbfd","Type":"ContainerDied","Data":"33c9c97d583242c57ea43e2bcdf0b0184b1877b5b7da03ad55f8b386f53a465c"} Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.271537 4735 generic.go:334] "Generic (PLEG): container finished" podID="c4ec8027-3683-44b2-a91a-a58f49dedbfd" containerID="33c9c97d583242c57ea43e2bcdf0b0184b1877b5b7da03ad55f8b386f53a465c" exitCode=0 Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.274066 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bzbm8" event={"ID":"c4ec8027-3683-44b2-a91a-a58f49dedbfd","Type":"ContainerDied","Data":"90856c0643b608717c06400d36d7b1e98305cd45913327c9412d8b0b23d68ba9"} Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.282281 4735 scope.go:117] "RemoveContainer" containerID="c1bf000a13b884870070b9ea0bc8f2c7160ef753770485ce7e4289bfc33ef70e" Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.328948 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gcgz5"] Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.337770 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gcgz5"] Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.344065 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bzbm8"] Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.344233 4735 scope.go:117] "RemoveContainer" containerID="075a64faff6b67ef7d4a1d9454f9226618ac992b395f4daa76577f30908af33d" Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.348272 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/community-operators-bzbm8"] Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.364095 4735 scope.go:117] "RemoveContainer" containerID="7ade191e2ff5175f673b2498da4760c0106140645bea01b1e3ae45351522f3ea" Feb 23 00:10:33 crc kubenswrapper[4735]: E0223 00:10:33.364640 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ade191e2ff5175f673b2498da4760c0106140645bea01b1e3ae45351522f3ea\": container with ID starting with 7ade191e2ff5175f673b2498da4760c0106140645bea01b1e3ae45351522f3ea not found: ID does not exist" containerID="7ade191e2ff5175f673b2498da4760c0106140645bea01b1e3ae45351522f3ea" Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.364689 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ade191e2ff5175f673b2498da4760c0106140645bea01b1e3ae45351522f3ea"} err="failed to get container status \"7ade191e2ff5175f673b2498da4760c0106140645bea01b1e3ae45351522f3ea\": rpc error: code = NotFound desc = could not find container \"7ade191e2ff5175f673b2498da4760c0106140645bea01b1e3ae45351522f3ea\": container with ID starting with 7ade191e2ff5175f673b2498da4760c0106140645bea01b1e3ae45351522f3ea not found: ID does not exist" Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.364774 4735 scope.go:117] "RemoveContainer" containerID="c1bf000a13b884870070b9ea0bc8f2c7160ef753770485ce7e4289bfc33ef70e" Feb 23 00:10:33 crc kubenswrapper[4735]: E0223 00:10:33.365119 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1bf000a13b884870070b9ea0bc8f2c7160ef753770485ce7e4289bfc33ef70e\": container with ID starting with c1bf000a13b884870070b9ea0bc8f2c7160ef753770485ce7e4289bfc33ef70e not found: ID does not exist" containerID="c1bf000a13b884870070b9ea0bc8f2c7160ef753770485ce7e4289bfc33ef70e" Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 
00:10:33.365143 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1bf000a13b884870070b9ea0bc8f2c7160ef753770485ce7e4289bfc33ef70e"} err="failed to get container status \"c1bf000a13b884870070b9ea0bc8f2c7160ef753770485ce7e4289bfc33ef70e\": rpc error: code = NotFound desc = could not find container \"c1bf000a13b884870070b9ea0bc8f2c7160ef753770485ce7e4289bfc33ef70e\": container with ID starting with c1bf000a13b884870070b9ea0bc8f2c7160ef753770485ce7e4289bfc33ef70e not found: ID does not exist" Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.365162 4735 scope.go:117] "RemoveContainer" containerID="075a64faff6b67ef7d4a1d9454f9226618ac992b395f4daa76577f30908af33d" Feb 23 00:10:33 crc kubenswrapper[4735]: E0223 00:10:33.365439 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"075a64faff6b67ef7d4a1d9454f9226618ac992b395f4daa76577f30908af33d\": container with ID starting with 075a64faff6b67ef7d4a1d9454f9226618ac992b395f4daa76577f30908af33d not found: ID does not exist" containerID="075a64faff6b67ef7d4a1d9454f9226618ac992b395f4daa76577f30908af33d" Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.365470 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"075a64faff6b67ef7d4a1d9454f9226618ac992b395f4daa76577f30908af33d"} err="failed to get container status \"075a64faff6b67ef7d4a1d9454f9226618ac992b395f4daa76577f30908af33d\": rpc error: code = NotFound desc = could not find container \"075a64faff6b67ef7d4a1d9454f9226618ac992b395f4daa76577f30908af33d\": container with ID starting with 075a64faff6b67ef7d4a1d9454f9226618ac992b395f4daa76577f30908af33d not found: ID does not exist" Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.365488 4735 scope.go:117] "RemoveContainer" containerID="33c9c97d583242c57ea43e2bcdf0b0184b1877b5b7da03ad55f8b386f53a465c" Feb 23 00:10:33 crc 
kubenswrapper[4735]: I0223 00:10:33.385381 4735 scope.go:117] "RemoveContainer" containerID="847f4ae6264b59f732798f4afd3c15bf21d28f56479df92fd29ec995c8173df9" Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.407729 4735 scope.go:117] "RemoveContainer" containerID="2e1a361143ec447e2ad086d99d64b28ef90ea60462ce1fc4cfa11d8b8699c7ca" Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.432320 4735 scope.go:117] "RemoveContainer" containerID="33c9c97d583242c57ea43e2bcdf0b0184b1877b5b7da03ad55f8b386f53a465c" Feb 23 00:10:33 crc kubenswrapper[4735]: E0223 00:10:33.435814 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33c9c97d583242c57ea43e2bcdf0b0184b1877b5b7da03ad55f8b386f53a465c\": container with ID starting with 33c9c97d583242c57ea43e2bcdf0b0184b1877b5b7da03ad55f8b386f53a465c not found: ID does not exist" containerID="33c9c97d583242c57ea43e2bcdf0b0184b1877b5b7da03ad55f8b386f53a465c" Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.435982 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33c9c97d583242c57ea43e2bcdf0b0184b1877b5b7da03ad55f8b386f53a465c"} err="failed to get container status \"33c9c97d583242c57ea43e2bcdf0b0184b1877b5b7da03ad55f8b386f53a465c\": rpc error: code = NotFound desc = could not find container \"33c9c97d583242c57ea43e2bcdf0b0184b1877b5b7da03ad55f8b386f53a465c\": container with ID starting with 33c9c97d583242c57ea43e2bcdf0b0184b1877b5b7da03ad55f8b386f53a465c not found: ID does not exist" Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.436016 4735 scope.go:117] "RemoveContainer" containerID="847f4ae6264b59f732798f4afd3c15bf21d28f56479df92fd29ec995c8173df9" Feb 23 00:10:33 crc kubenswrapper[4735]: E0223 00:10:33.437359 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"847f4ae6264b59f732798f4afd3c15bf21d28f56479df92fd29ec995c8173df9\": container with ID starting with 847f4ae6264b59f732798f4afd3c15bf21d28f56479df92fd29ec995c8173df9 not found: ID does not exist" containerID="847f4ae6264b59f732798f4afd3c15bf21d28f56479df92fd29ec995c8173df9" Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.437391 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"847f4ae6264b59f732798f4afd3c15bf21d28f56479df92fd29ec995c8173df9"} err="failed to get container status \"847f4ae6264b59f732798f4afd3c15bf21d28f56479df92fd29ec995c8173df9\": rpc error: code = NotFound desc = could not find container \"847f4ae6264b59f732798f4afd3c15bf21d28f56479df92fd29ec995c8173df9\": container with ID starting with 847f4ae6264b59f732798f4afd3c15bf21d28f56479df92fd29ec995c8173df9 not found: ID does not exist" Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.437414 4735 scope.go:117] "RemoveContainer" containerID="2e1a361143ec447e2ad086d99d64b28ef90ea60462ce1fc4cfa11d8b8699c7ca" Feb 23 00:10:33 crc kubenswrapper[4735]: E0223 00:10:33.437837 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e1a361143ec447e2ad086d99d64b28ef90ea60462ce1fc4cfa11d8b8699c7ca\": container with ID starting with 2e1a361143ec447e2ad086d99d64b28ef90ea60462ce1fc4cfa11d8b8699c7ca not found: ID does not exist" containerID="2e1a361143ec447e2ad086d99d64b28ef90ea60462ce1fc4cfa11d8b8699c7ca" Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.437918 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e1a361143ec447e2ad086d99d64b28ef90ea60462ce1fc4cfa11d8b8699c7ca"} err="failed to get container status \"2e1a361143ec447e2ad086d99d64b28ef90ea60462ce1fc4cfa11d8b8699c7ca\": rpc error: code = NotFound desc = could not find container \"2e1a361143ec447e2ad086d99d64b28ef90ea60462ce1fc4cfa11d8b8699c7ca\": container with ID 
starting with 2e1a361143ec447e2ad086d99d64b28ef90ea60462ce1fc4cfa11d8b8699c7ca not found: ID does not exist" Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.655305 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 23 00:10:33 crc kubenswrapper[4735]: E0223 00:10:33.655907 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78bcce9f-48e6-49c5-8f3f-75d8e156f6bc" containerName="extract-content" Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.655922 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="78bcce9f-48e6-49c5-8f3f-75d8e156f6bc" containerName="extract-content" Feb 23 00:10:33 crc kubenswrapper[4735]: E0223 00:10:33.655935 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78bcce9f-48e6-49c5-8f3f-75d8e156f6bc" containerName="extract-utilities" Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.655943 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="78bcce9f-48e6-49c5-8f3f-75d8e156f6bc" containerName="extract-utilities" Feb 23 00:10:33 crc kubenswrapper[4735]: E0223 00:10:33.655955 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4ec8027-3683-44b2-a91a-a58f49dedbfd" containerName="extract-utilities" Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.655964 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4ec8027-3683-44b2-a91a-a58f49dedbfd" containerName="extract-utilities" Feb 23 00:10:33 crc kubenswrapper[4735]: E0223 00:10:33.655977 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78bcce9f-48e6-49c5-8f3f-75d8e156f6bc" containerName="registry-server" Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.655985 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="78bcce9f-48e6-49c5-8f3f-75d8e156f6bc" containerName="registry-server" Feb 23 00:10:33 crc kubenswrapper[4735]: E0223 00:10:33.656002 4735 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="c4ec8027-3683-44b2-a91a-a58f49dedbfd" containerName="extract-content" Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.656010 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4ec8027-3683-44b2-a91a-a58f49dedbfd" containerName="extract-content" Feb 23 00:10:33 crc kubenswrapper[4735]: E0223 00:10:33.656019 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4ec8027-3683-44b2-a91a-a58f49dedbfd" containerName="registry-server" Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.656027 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4ec8027-3683-44b2-a91a-a58f49dedbfd" containerName="registry-server" Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.656143 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4ec8027-3683-44b2-a91a-a58f49dedbfd" containerName="registry-server" Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.656156 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="78bcce9f-48e6-49c5-8f3f-75d8e156f6bc" containerName="registry-server" Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.656574 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.658379 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.659366 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.728382 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.833833 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/689eee0d-6f33-47e0-bb6d-8c23d3cca3d6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"689eee0d-6f33-47e0-bb6d-8c23d3cca3d6\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.833986 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/689eee0d-6f33-47e0-bb6d-8c23d3cca3d6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"689eee0d-6f33-47e0-bb6d-8c23d3cca3d6\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.935530 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/689eee0d-6f33-47e0-bb6d-8c23d3cca3d6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"689eee0d-6f33-47e0-bb6d-8c23d3cca3d6\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.935632 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/689eee0d-6f33-47e0-bb6d-8c23d3cca3d6-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"689eee0d-6f33-47e0-bb6d-8c23d3cca3d6\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.935681 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/689eee0d-6f33-47e0-bb6d-8c23d3cca3d6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"689eee0d-6f33-47e0-bb6d-8c23d3cca3d6\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 23 00:10:33 crc kubenswrapper[4735]: I0223 00:10:33.958878 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/689eee0d-6f33-47e0-bb6d-8c23d3cca3d6-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"689eee0d-6f33-47e0-bb6d-8c23d3cca3d6\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 23 00:10:34 crc kubenswrapper[4735]: I0223 00:10:34.033259 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 23 00:10:34 crc kubenswrapper[4735]: I0223 00:10:34.278618 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78bcce9f-48e6-49c5-8f3f-75d8e156f6bc" path="/var/lib/kubelet/pods/78bcce9f-48e6-49c5-8f3f-75d8e156f6bc/volumes" Feb 23 00:10:34 crc kubenswrapper[4735]: I0223 00:10:34.279463 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4ec8027-3683-44b2-a91a-a58f49dedbfd" path="/var/lib/kubelet/pods/c4ec8027-3683-44b2-a91a-a58f49dedbfd/volumes" Feb 23 00:10:34 crc kubenswrapper[4735]: I0223 00:10:34.284896 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mvwmg" event={"ID":"703af3dd-f895-4e96-991a-7e8a405bb03e","Type":"ContainerStarted","Data":"06b091925f2c8a3efa1932e7f2e45fede462a9935580ff5a44b489dc05f58547"} Feb 23 00:10:34 crc kubenswrapper[4735]: I0223 00:10:34.302688 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 23 00:10:34 crc kubenswrapper[4735]: I0223 00:10:34.315409 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mvwmg" podStartSLOduration=2.33016726 podStartE2EDuration="42.315392537s" podCreationTimestamp="2026-02-23 00:09:52 +0000 UTC" firstStartedPulling="2026-02-23 00:09:53.742102994 +0000 UTC m=+152.205648965" lastFinishedPulling="2026-02-23 00:10:33.727328261 +0000 UTC m=+192.190874242" observedRunningTime="2026-02-23 00:10:34.313896591 +0000 UTC m=+192.777442562" watchObservedRunningTime="2026-02-23 00:10:34.315392537 +0000 UTC m=+192.778938508" Feb 23 00:10:34 crc kubenswrapper[4735]: W0223 00:10:34.318352 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod689eee0d_6f33_47e0_bb6d_8c23d3cca3d6.slice/crio-d2de45cd07dbf1cd28d80f4ecec7312dc2e266c1cddb5e1223e4e34a3e07f351 WatchSource:0}: Error 
finding container d2de45cd07dbf1cd28d80f4ecec7312dc2e266c1cddb5e1223e4e34a3e07f351: Status 404 returned error can't find the container with id d2de45cd07dbf1cd28d80f4ecec7312dc2e266c1cddb5e1223e4e34a3e07f351 Feb 23 00:10:35 crc kubenswrapper[4735]: I0223 00:10:35.292461 4735 generic.go:334] "Generic (PLEG): container finished" podID="6b4dd4f4-5ead-4f6f-a993-34effb6df863" containerID="bd9f026fad72802ff92fbd116725a4cfd5106959505c59780ea9a34da5e19436" exitCode=0 Feb 23 00:10:35 crc kubenswrapper[4735]: I0223 00:10:35.292550 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5xl8z" event={"ID":"6b4dd4f4-5ead-4f6f-a993-34effb6df863","Type":"ContainerDied","Data":"bd9f026fad72802ff92fbd116725a4cfd5106959505c59780ea9a34da5e19436"} Feb 23 00:10:35 crc kubenswrapper[4735]: I0223 00:10:35.299098 4735 generic.go:334] "Generic (PLEG): container finished" podID="689eee0d-6f33-47e0-bb6d-8c23d3cca3d6" containerID="9bc5f1e32c6f6437e29184b06542570bb13e0fd39f4a855954c8b95bcc6da04c" exitCode=0 Feb 23 00:10:35 crc kubenswrapper[4735]: I0223 00:10:35.299194 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"689eee0d-6f33-47e0-bb6d-8c23d3cca3d6","Type":"ContainerDied","Data":"9bc5f1e32c6f6437e29184b06542570bb13e0fd39f4a855954c8b95bcc6da04c"} Feb 23 00:10:35 crc kubenswrapper[4735]: I0223 00:10:35.299240 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"689eee0d-6f33-47e0-bb6d-8c23d3cca3d6","Type":"ContainerStarted","Data":"d2de45cd07dbf1cd28d80f4ecec7312dc2e266c1cddb5e1223e4e34a3e07f351"} Feb 23 00:10:35 crc kubenswrapper[4735]: I0223 00:10:35.301546 4735 generic.go:334] "Generic (PLEG): container finished" podID="0e33bc37-4c54-4f50-95b7-bd2cf2da176e" containerID="5b94d40e6164b0ed2e6ce9c8b5d41d1fd2dda247666febe6f678712cc7880357" exitCode=0 Feb 23 00:10:35 crc kubenswrapper[4735]: I0223 00:10:35.301591 
4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-74kxk" event={"ID":"0e33bc37-4c54-4f50-95b7-bd2cf2da176e","Type":"ContainerDied","Data":"5b94d40e6164b0ed2e6ce9c8b5d41d1fd2dda247666febe6f678712cc7880357"} Feb 23 00:10:36 crc kubenswrapper[4735]: I0223 00:10:36.307897 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5xl8z" event={"ID":"6b4dd4f4-5ead-4f6f-a993-34effb6df863","Type":"ContainerStarted","Data":"b292cf6aa732ca4e252a8296bae3efff4a11e17e9228ed5790004eceb031be7b"} Feb 23 00:10:36 crc kubenswrapper[4735]: I0223 00:10:36.315508 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-74kxk" event={"ID":"0e33bc37-4c54-4f50-95b7-bd2cf2da176e","Type":"ContainerStarted","Data":"34133bc33001e803294fef5a1e7de9bb302dd8d3c3eb47a331df115da0f95447"} Feb 23 00:10:36 crc kubenswrapper[4735]: I0223 00:10:36.324467 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5xl8z" podStartSLOduration=3.340948769 podStartE2EDuration="44.324436986s" podCreationTimestamp="2026-02-23 00:09:52 +0000 UTC" firstStartedPulling="2026-02-23 00:09:54.76182085 +0000 UTC m=+153.225366821" lastFinishedPulling="2026-02-23 00:10:35.745309067 +0000 UTC m=+194.208855038" observedRunningTime="2026-02-23 00:10:36.323249837 +0000 UTC m=+194.786795808" watchObservedRunningTime="2026-02-23 00:10:36.324436986 +0000 UTC m=+194.787982957" Feb 23 00:10:36 crc kubenswrapper[4735]: I0223 00:10:36.338831 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-74kxk" podStartSLOduration=3.400445601 podStartE2EDuration="45.338809262s" podCreationTimestamp="2026-02-23 00:09:51 +0000 UTC" firstStartedPulling="2026-02-23 00:09:53.741547411 +0000 UTC m=+152.205093382" lastFinishedPulling="2026-02-23 00:10:35.679911072 +0000 UTC m=+194.143457043" 
observedRunningTime="2026-02-23 00:10:36.338557646 +0000 UTC m=+194.802103617" watchObservedRunningTime="2026-02-23 00:10:36.338809262 +0000 UTC m=+194.802355243" Feb 23 00:10:36 crc kubenswrapper[4735]: I0223 00:10:36.682976 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 23 00:10:36 crc kubenswrapper[4735]: I0223 00:10:36.877097 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/689eee0d-6f33-47e0-bb6d-8c23d3cca3d6-kubelet-dir\") pod \"689eee0d-6f33-47e0-bb6d-8c23d3cca3d6\" (UID: \"689eee0d-6f33-47e0-bb6d-8c23d3cca3d6\") " Feb 23 00:10:36 crc kubenswrapper[4735]: I0223 00:10:36.877166 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/689eee0d-6f33-47e0-bb6d-8c23d3cca3d6-kube-api-access\") pod \"689eee0d-6f33-47e0-bb6d-8c23d3cca3d6\" (UID: \"689eee0d-6f33-47e0-bb6d-8c23d3cca3d6\") " Feb 23 00:10:36 crc kubenswrapper[4735]: I0223 00:10:36.877198 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/689eee0d-6f33-47e0-bb6d-8c23d3cca3d6-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "689eee0d-6f33-47e0-bb6d-8c23d3cca3d6" (UID: "689eee0d-6f33-47e0-bb6d-8c23d3cca3d6"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 00:10:36 crc kubenswrapper[4735]: I0223 00:10:36.877403 4735 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/689eee0d-6f33-47e0-bb6d-8c23d3cca3d6-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 23 00:10:36 crc kubenswrapper[4735]: I0223 00:10:36.884036 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/689eee0d-6f33-47e0-bb6d-8c23d3cca3d6-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "689eee0d-6f33-47e0-bb6d-8c23d3cca3d6" (UID: "689eee0d-6f33-47e0-bb6d-8c23d3cca3d6"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:10:36 crc kubenswrapper[4735]: I0223 00:10:36.978597 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/689eee0d-6f33-47e0-bb6d-8c23d3cca3d6-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 23 00:10:37 crc kubenswrapper[4735]: I0223 00:10:37.322610 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"689eee0d-6f33-47e0-bb6d-8c23d3cca3d6","Type":"ContainerDied","Data":"d2de45cd07dbf1cd28d80f4ecec7312dc2e266c1cddb5e1223e4e34a3e07f351"} Feb 23 00:10:37 crc kubenswrapper[4735]: I0223 00:10:37.322648 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2de45cd07dbf1cd28d80f4ecec7312dc2e266c1cddb5e1223e4e34a3e07f351" Feb 23 00:10:37 crc kubenswrapper[4735]: I0223 00:10:37.322696 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 23 00:10:39 crc kubenswrapper[4735]: I0223 00:10:39.650328 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 23 00:10:39 crc kubenswrapper[4735]: E0223 00:10:39.650838 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="689eee0d-6f33-47e0-bb6d-8c23d3cca3d6" containerName="pruner" Feb 23 00:10:39 crc kubenswrapper[4735]: I0223 00:10:39.650876 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="689eee0d-6f33-47e0-bb6d-8c23d3cca3d6" containerName="pruner" Feb 23 00:10:39 crc kubenswrapper[4735]: I0223 00:10:39.651003 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="689eee0d-6f33-47e0-bb6d-8c23d3cca3d6" containerName="pruner" Feb 23 00:10:39 crc kubenswrapper[4735]: I0223 00:10:39.651418 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 23 00:10:39 crc kubenswrapper[4735]: I0223 00:10:39.658578 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 23 00:10:39 crc kubenswrapper[4735]: I0223 00:10:39.658619 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 23 00:10:39 crc kubenswrapper[4735]: I0223 00:10:39.670050 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 23 00:10:39 crc kubenswrapper[4735]: I0223 00:10:39.713846 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5fd08331-c55b-4c64-b731-546ab66801e7-var-lock\") pod \"installer-9-crc\" (UID: \"5fd08331-c55b-4c64-b731-546ab66801e7\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 23 00:10:39 crc kubenswrapper[4735]: I0223 00:10:39.714203 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5fd08331-c55b-4c64-b731-546ab66801e7-kube-api-access\") pod \"installer-9-crc\" (UID: \"5fd08331-c55b-4c64-b731-546ab66801e7\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 23 00:10:39 crc kubenswrapper[4735]: I0223 00:10:39.714343 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5fd08331-c55b-4c64-b731-546ab66801e7-kubelet-dir\") pod \"installer-9-crc\" (UID: \"5fd08331-c55b-4c64-b731-546ab66801e7\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 23 00:10:39 crc kubenswrapper[4735]: I0223 00:10:39.815432 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5fd08331-c55b-4c64-b731-546ab66801e7-kubelet-dir\") pod \"installer-9-crc\" (UID: \"5fd08331-c55b-4c64-b731-546ab66801e7\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 23 00:10:39 crc kubenswrapper[4735]: I0223 00:10:39.815555 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5fd08331-c55b-4c64-b731-546ab66801e7-var-lock\") pod \"installer-9-crc\" (UID: \"5fd08331-c55b-4c64-b731-546ab66801e7\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 23 00:10:39 crc kubenswrapper[4735]: I0223 00:10:39.815618 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5fd08331-c55b-4c64-b731-546ab66801e7-kube-api-access\") pod \"installer-9-crc\" (UID: \"5fd08331-c55b-4c64-b731-546ab66801e7\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 23 00:10:39 crc kubenswrapper[4735]: I0223 00:10:39.815640 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" 
(UniqueName: \"kubernetes.io/host-path/5fd08331-c55b-4c64-b731-546ab66801e7-kubelet-dir\") pod \"installer-9-crc\" (UID: \"5fd08331-c55b-4c64-b731-546ab66801e7\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 23 00:10:39 crc kubenswrapper[4735]: I0223 00:10:39.815730 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5fd08331-c55b-4c64-b731-546ab66801e7-var-lock\") pod \"installer-9-crc\" (UID: \"5fd08331-c55b-4c64-b731-546ab66801e7\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 23 00:10:39 crc kubenswrapper[4735]: I0223 00:10:39.836595 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5fd08331-c55b-4c64-b731-546ab66801e7-kube-api-access\") pod \"installer-9-crc\" (UID: \"5fd08331-c55b-4c64-b731-546ab66801e7\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 23 00:10:39 crc kubenswrapper[4735]: I0223 00:10:39.976553 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 23 00:10:40 crc kubenswrapper[4735]: I0223 00:10:40.498330 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 23 00:10:40 crc kubenswrapper[4735]: W0223 00:10:40.507393 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5fd08331_c55b_4c64_b731_546ab66801e7.slice/crio-55549792dee1248da2cfdece269052d12fa6f42bcda807801ee5b2e7a8fc7b9f WatchSource:0}: Error finding container 55549792dee1248da2cfdece269052d12fa6f42bcda807801ee5b2e7a8fc7b9f: Status 404 returned error can't find the container with id 55549792dee1248da2cfdece269052d12fa6f42bcda807801ee5b2e7a8fc7b9f Feb 23 00:10:41 crc kubenswrapper[4735]: I0223 00:10:41.350061 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5fd08331-c55b-4c64-b731-546ab66801e7","Type":"ContainerStarted","Data":"d6e85572ac86fe5803be17e4efe2df9466228f93197373f40c3c8221a1a4c1ae"} Feb 23 00:10:41 crc kubenswrapper[4735]: I0223 00:10:41.350454 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5fd08331-c55b-4c64-b731-546ab66801e7","Type":"ContainerStarted","Data":"55549792dee1248da2cfdece269052d12fa6f42bcda807801ee5b2e7a8fc7b9f"} Feb 23 00:10:41 crc kubenswrapper[4735]: I0223 00:10:41.377235 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.377208274 podStartE2EDuration="2.377208274s" podCreationTimestamp="2026-02-23 00:10:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:10:41.373362889 +0000 UTC m=+199.836908900" watchObservedRunningTime="2026-02-23 00:10:41.377208274 +0000 UTC m=+199.840754285" Feb 23 00:10:41 crc kubenswrapper[4735]: I0223 
00:10:41.512436 4735 patch_prober.go:28] interesting pod/machine-config-daemon-blmnv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 23 00:10:41 crc kubenswrapper[4735]: I0223 00:10:41.512514 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" podUID="1cba474f-2d55-4a07-969f-25e2817a06d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 23 00:10:42 crc kubenswrapper[4735]: I0223 00:10:42.289257 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-74kxk"
Feb 23 00:10:42 crc kubenswrapper[4735]: I0223 00:10:42.289325 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-74kxk"
Feb 23 00:10:42 crc kubenswrapper[4735]: I0223 00:10:42.332548 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-74kxk"
Feb 23 00:10:42 crc kubenswrapper[4735]: I0223 00:10:42.422897 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-74kxk"
Feb 23 00:10:42 crc kubenswrapper[4735]: I0223 00:10:42.573221 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mvwmg"
Feb 23 00:10:42 crc kubenswrapper[4735]: I0223 00:10:42.573319 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mvwmg"
Feb 23 00:10:42 crc kubenswrapper[4735]: I0223 00:10:42.640123 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mvwmg"
Feb 23 00:10:43 crc kubenswrapper[4735]: I0223 00:10:43.061020 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5xl8z"
Feb 23 00:10:43 crc kubenswrapper[4735]: I0223 00:10:43.061393 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5xl8z"
Feb 23 00:10:43 crc kubenswrapper[4735]: I0223 00:10:43.112927 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5xl8z"
Feb 23 00:10:43 crc kubenswrapper[4735]: I0223 00:10:43.394023 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5xl8z"
Feb 23 00:10:43 crc kubenswrapper[4735]: I0223 00:10:43.403629 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mvwmg"
Feb 23 00:10:44 crc kubenswrapper[4735]: I0223 00:10:44.377744 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5xl8z"]
Feb 23 00:10:46 crc kubenswrapper[4735]: I0223 00:10:46.132400 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7" podUID="cf538d4f-1acd-4e61-9827-7430d1099138" containerName="oauth-openshift" containerID="cri-o://abc155ad6df61fdd08392e424628e1cee5325189224dbd35ce2627b3683726ae" gracePeriod=15
Feb 23 00:10:46 crc kubenswrapper[4735]: I0223 00:10:46.139959 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-cb6b46446-77nq4"]
Feb 23 00:10:46 crc kubenswrapper[4735]: I0223 00:10:46.140266 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-cb6b46446-77nq4" podUID="756348da-76db-4458-a280-488a6c79756a" containerName="controller-manager" containerID="cri-o://704c75a1c565a4f56b1ecfc0889f91c9e80230550fe1b31c34f734d79de23206" gracePeriod=30
Feb 23 00:10:46 crc kubenswrapper[4735]: I0223 00:10:46.177747 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-597ccdd7cc-vbgfb"]
Feb 23 00:10:46 crc kubenswrapper[4735]: I0223 00:10:46.178103 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-597ccdd7cc-vbgfb" podUID="78e0db83-6432-4c4c-bff1-e5134760124e" containerName="route-controller-manager" containerID="cri-o://46d1076fb3c645a22bde2440950984f2d329f8b483459c85ff00efacec7c619e" gracePeriod=30
Feb 23 00:10:46 crc kubenswrapper[4735]: I0223 00:10:46.380018 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5xl8z" podUID="6b4dd4f4-5ead-4f6f-a993-34effb6df863" containerName="registry-server" containerID="cri-o://b292cf6aa732ca4e252a8296bae3efff4a11e17e9228ed5790004eceb031be7b" gracePeriod=2
Feb 23 00:10:46 crc kubenswrapper[4735]: I0223 00:10:46.784023 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-74kxk"]
Feb 23 00:10:46 crc kubenswrapper[4735]: I0223 00:10:46.784454 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-74kxk" podUID="0e33bc37-4c54-4f50-95b7-bd2cf2da176e" containerName="registry-server" containerID="cri-o://34133bc33001e803294fef5a1e7de9bb302dd8d3c3eb47a331df115da0f95447" gracePeriod=2
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.182913 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-cb6b46446-77nq4"
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.233907 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7cc59c78dd-gv49x"]
Feb 23 00:10:47 crc kubenswrapper[4735]: E0223 00:10:47.234158 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="756348da-76db-4458-a280-488a6c79756a" containerName="controller-manager"
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.234173 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="756348da-76db-4458-a280-488a6c79756a" containerName="controller-manager"
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.234306 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="756348da-76db-4458-a280-488a6c79756a" containerName="controller-manager"
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.237643 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7cc59c78dd-gv49x"
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.247461 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7cc59c78dd-gv49x"]
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.320280 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/756348da-76db-4458-a280-488a6c79756a-serving-cert\") pod \"756348da-76db-4458-a280-488a6c79756a\" (UID: \"756348da-76db-4458-a280-488a6c79756a\") "
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.320355 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/756348da-76db-4458-a280-488a6c79756a-proxy-ca-bundles\") pod \"756348da-76db-4458-a280-488a6c79756a\" (UID: \"756348da-76db-4458-a280-488a6c79756a\") "
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.320415 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/756348da-76db-4458-a280-488a6c79756a-client-ca\") pod \"756348da-76db-4458-a280-488a6c79756a\" (UID: \"756348da-76db-4458-a280-488a6c79756a\") "
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.320540 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/756348da-76db-4458-a280-488a6c79756a-config\") pod \"756348da-76db-4458-a280-488a6c79756a\" (UID: \"756348da-76db-4458-a280-488a6c79756a\") "
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.320570 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8shrv\" (UniqueName: \"kubernetes.io/projected/756348da-76db-4458-a280-488a6c79756a-kube-api-access-8shrv\") pod \"756348da-76db-4458-a280-488a6c79756a\" (UID: \"756348da-76db-4458-a280-488a6c79756a\") "
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.321340 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/756348da-76db-4458-a280-488a6c79756a-client-ca" (OuterVolumeSpecName: "client-ca") pod "756348da-76db-4458-a280-488a6c79756a" (UID: "756348da-76db-4458-a280-488a6c79756a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.321452 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/756348da-76db-4458-a280-488a6c79756a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "756348da-76db-4458-a280-488a6c79756a" (UID: "756348da-76db-4458-a280-488a6c79756a"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.321842 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/756348da-76db-4458-a280-488a6c79756a-config" (OuterVolumeSpecName: "config") pod "756348da-76db-4458-a280-488a6c79756a" (UID: "756348da-76db-4458-a280-488a6c79756a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.326112 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/756348da-76db-4458-a280-488a6c79756a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "756348da-76db-4458-a280-488a6c79756a" (UID: "756348da-76db-4458-a280-488a6c79756a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.326706 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/756348da-76db-4458-a280-488a6c79756a-kube-api-access-8shrv" (OuterVolumeSpecName: "kube-api-access-8shrv") pod "756348da-76db-4458-a280-488a6c79756a" (UID: "756348da-76db-4458-a280-488a6c79756a"). InnerVolumeSpecName "kube-api-access-8shrv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.394517 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-597ccdd7cc-vbgfb"
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.395682 4735 generic.go:334] "Generic (PLEG): container finished" podID="0e33bc37-4c54-4f50-95b7-bd2cf2da176e" containerID="34133bc33001e803294fef5a1e7de9bb302dd8d3c3eb47a331df115da0f95447" exitCode=0
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.395769 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-74kxk" event={"ID":"0e33bc37-4c54-4f50-95b7-bd2cf2da176e","Type":"ContainerDied","Data":"34133bc33001e803294fef5a1e7de9bb302dd8d3c3eb47a331df115da0f95447"}
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.396334 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-74kxk" event={"ID":"0e33bc37-4c54-4f50-95b7-bd2cf2da176e","Type":"ContainerDied","Data":"b8fef3900c6546e311dd961fa0a0c81f83d4c53e096975eab1990148820d32ac"}
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.396367 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8fef3900c6546e311dd961fa0a0c81f83d4c53e096975eab1990148820d32ac"
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.397930 4735 generic.go:334] "Generic (PLEG): container finished" podID="78e0db83-6432-4c4c-bff1-e5134760124e" containerID="46d1076fb3c645a22bde2440950984f2d329f8b483459c85ff00efacec7c619e" exitCode=0
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.397969 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-597ccdd7cc-vbgfb" event={"ID":"78e0db83-6432-4c4c-bff1-e5134760124e","Type":"ContainerDied","Data":"46d1076fb3c645a22bde2440950984f2d329f8b483459c85ff00efacec7c619e"}
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.397993 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-597ccdd7cc-vbgfb" event={"ID":"78e0db83-6432-4c4c-bff1-e5134760124e","Type":"ContainerDied","Data":"9e97097d3b2f1f8c5152a8e4344ef9c74a7d230473dd06c9e7b1b103e0482b74"}
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.397999 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-597ccdd7cc-vbgfb"
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.398040 4735 scope.go:117] "RemoveContainer" containerID="46d1076fb3c645a22bde2440950984f2d329f8b483459c85ff00efacec7c619e"
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.398819 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-74kxk"
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.399530 4735 generic.go:334] "Generic (PLEG): container finished" podID="cf538d4f-1acd-4e61-9827-7430d1099138" containerID="abc155ad6df61fdd08392e424628e1cee5325189224dbd35ce2627b3683726ae" exitCode=0
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.399575 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7" event={"ID":"cf538d4f-1acd-4e61-9827-7430d1099138","Type":"ContainerDied","Data":"abc155ad6df61fdd08392e424628e1cee5325189224dbd35ce2627b3683726ae"}
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.399590 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7" event={"ID":"cf538d4f-1acd-4e61-9827-7430d1099138","Type":"ContainerDied","Data":"36dbe32caa78c08a4de251b13d878b7abdd28dcb0ec2dcf42ff6beb905c768d7"}
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.399604 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36dbe32caa78c08a4de251b13d878b7abdd28dcb0ec2dcf42ff6beb905c768d7"
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.406289 4735 generic.go:334] "Generic (PLEG): container finished" podID="756348da-76db-4458-a280-488a6c79756a" containerID="704c75a1c565a4f56b1ecfc0889f91c9e80230550fe1b31c34f734d79de23206" exitCode=0
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.406366 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-cb6b46446-77nq4" event={"ID":"756348da-76db-4458-a280-488a6c79756a","Type":"ContainerDied","Data":"704c75a1c565a4f56b1ecfc0889f91c9e80230550fe1b31c34f734d79de23206"}
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.406401 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-cb6b46446-77nq4" event={"ID":"756348da-76db-4458-a280-488a6c79756a","Type":"ContainerDied","Data":"ed8af5d1625b7f79bdb98f6c6340a5d96e29e7946ff573099b77ded79b50d304"}
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.406474 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-cb6b46446-77nq4"
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.408012 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7"
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.409904 4735 generic.go:334] "Generic (PLEG): container finished" podID="6b4dd4f4-5ead-4f6f-a993-34effb6df863" containerID="b292cf6aa732ca4e252a8296bae3efff4a11e17e9228ed5790004eceb031be7b" exitCode=0
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.409949 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5xl8z" event={"ID":"6b4dd4f4-5ead-4f6f-a993-34effb6df863","Type":"ContainerDied","Data":"b292cf6aa732ca4e252a8296bae3efff4a11e17e9228ed5790004eceb031be7b"}
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.415422 4735 scope.go:117] "RemoveContainer" containerID="46d1076fb3c645a22bde2440950984f2d329f8b483459c85ff00efacec7c619e"
Feb 23 00:10:47 crc kubenswrapper[4735]: E0223 00:10:47.416119 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46d1076fb3c645a22bde2440950984f2d329f8b483459c85ff00efacec7c619e\": container with ID starting with 46d1076fb3c645a22bde2440950984f2d329f8b483459c85ff00efacec7c619e not found: ID does not exist" containerID="46d1076fb3c645a22bde2440950984f2d329f8b483459c85ff00efacec7c619e"
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.416158 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46d1076fb3c645a22bde2440950984f2d329f8b483459c85ff00efacec7c619e"} err="failed to get container status \"46d1076fb3c645a22bde2440950984f2d329f8b483459c85ff00efacec7c619e\": rpc error: code = NotFound desc = could not find container \"46d1076fb3c645a22bde2440950984f2d329f8b483459c85ff00efacec7c619e\": container with ID starting with 46d1076fb3c645a22bde2440950984f2d329f8b483459c85ff00efacec7c619e not found: ID does not exist"
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.416186 4735 scope.go:117] "RemoveContainer" containerID="704c75a1c565a4f56b1ecfc0889f91c9e80230550fe1b31c34f734d79de23206"
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.421603 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9ec6d401-e9a0-4981-9f41-3c07d8bda018-proxy-ca-bundles\") pod \"controller-manager-7cc59c78dd-gv49x\" (UID: \"9ec6d401-e9a0-4981-9f41-3c07d8bda018\") " pod="openshift-controller-manager/controller-manager-7cc59c78dd-gv49x"
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.421649 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9ec6d401-e9a0-4981-9f41-3c07d8bda018-client-ca\") pod \"controller-manager-7cc59c78dd-gv49x\" (UID: \"9ec6d401-e9a0-4981-9f41-3c07d8bda018\") " pod="openshift-controller-manager/controller-manager-7cc59c78dd-gv49x"
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.421671 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ec6d401-e9a0-4981-9f41-3c07d8bda018-serving-cert\") pod \"controller-manager-7cc59c78dd-gv49x\" (UID: \"9ec6d401-e9a0-4981-9f41-3c07d8bda018\") " pod="openshift-controller-manager/controller-manager-7cc59c78dd-gv49x"
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.421914 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck99h\" (UniqueName: \"kubernetes.io/projected/9ec6d401-e9a0-4981-9f41-3c07d8bda018-kube-api-access-ck99h\") pod \"controller-manager-7cc59c78dd-gv49x\" (UID: \"9ec6d401-e9a0-4981-9f41-3c07d8bda018\") " pod="openshift-controller-manager/controller-manager-7cc59c78dd-gv49x"
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.421996 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ec6d401-e9a0-4981-9f41-3c07d8bda018-config\") pod \"controller-manager-7cc59c78dd-gv49x\" (UID: \"9ec6d401-e9a0-4981-9f41-3c07d8bda018\") " pod="openshift-controller-manager/controller-manager-7cc59c78dd-gv49x"
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.422074 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/756348da-76db-4458-a280-488a6c79756a-config\") on node \"crc\" DevicePath \"\""
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.422090 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8shrv\" (UniqueName: \"kubernetes.io/projected/756348da-76db-4458-a280-488a6c79756a-kube-api-access-8shrv\") on node \"crc\" DevicePath \"\""
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.422103 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/756348da-76db-4458-a280-488a6c79756a-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.422116 4735 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/756348da-76db-4458-a280-488a6c79756a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.422127 4735 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/756348da-76db-4458-a280-488a6c79756a-client-ca\") on node \"crc\" DevicePath \"\""
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.422472 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5xl8z"
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.435680 4735 scope.go:117] "RemoveContainer" containerID="704c75a1c565a4f56b1ecfc0889f91c9e80230550fe1b31c34f734d79de23206"
Feb 23 00:10:47 crc kubenswrapper[4735]: E0223 00:10:47.436181 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"704c75a1c565a4f56b1ecfc0889f91c9e80230550fe1b31c34f734d79de23206\": container with ID starting with 704c75a1c565a4f56b1ecfc0889f91c9e80230550fe1b31c34f734d79de23206 not found: ID does not exist" containerID="704c75a1c565a4f56b1ecfc0889f91c9e80230550fe1b31c34f734d79de23206"
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.436225 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"704c75a1c565a4f56b1ecfc0889f91c9e80230550fe1b31c34f734d79de23206"} err="failed to get container status \"704c75a1c565a4f56b1ecfc0889f91c9e80230550fe1b31c34f734d79de23206\": rpc error: code = NotFound desc = could not find container \"704c75a1c565a4f56b1ecfc0889f91c9e80230550fe1b31c34f734d79de23206\": container with ID starting with 704c75a1c565a4f56b1ecfc0889f91c9e80230550fe1b31c34f734d79de23206 not found: ID does not exist"
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.504205 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-cb6b46446-77nq4"]
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.506873 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-cb6b46446-77nq4"]
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.523320 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78e0db83-6432-4c4c-bff1-e5134760124e-serving-cert\") pod \"78e0db83-6432-4c4c-bff1-e5134760124e\" (UID: \"78e0db83-6432-4c4c-bff1-e5134760124e\") "
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.523356 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-system-router-certs\") pod \"cf538d4f-1acd-4e61-9827-7430d1099138\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") "
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.523388 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7snw\" (UniqueName: \"kubernetes.io/projected/78e0db83-6432-4c4c-bff1-e5134760124e-kube-api-access-d7snw\") pod \"78e0db83-6432-4c4c-bff1-e5134760124e\" (UID: \"78e0db83-6432-4c4c-bff1-e5134760124e\") "
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.523406 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-user-template-error\") pod \"cf538d4f-1acd-4e61-9827-7430d1099138\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") "
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.523423 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8l4rw\" (UniqueName: \"kubernetes.io/projected/cf538d4f-1acd-4e61-9827-7430d1099138-kube-api-access-8l4rw\") pod \"cf538d4f-1acd-4e61-9827-7430d1099138\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") "
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.523443 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-system-cliconfig\") pod \"cf538d4f-1acd-4e61-9827-7430d1099138\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") "
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.523458 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-system-session\") pod \"cf538d4f-1acd-4e61-9827-7430d1099138\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") "
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.523477 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-system-serving-cert\") pod \"cf538d4f-1acd-4e61-9827-7430d1099138\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") "
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.523494 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25n9n\" (UniqueName: \"kubernetes.io/projected/0e33bc37-4c54-4f50-95b7-bd2cf2da176e-kube-api-access-25n9n\") pod \"0e33bc37-4c54-4f50-95b7-bd2cf2da176e\" (UID: \"0e33bc37-4c54-4f50-95b7-bd2cf2da176e\") "
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.523509 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b4dd4f4-5ead-4f6f-a993-34effb6df863-utilities\") pod \"6b4dd4f4-5ead-4f6f-a993-34effb6df863\" (UID: \"6b4dd4f4-5ead-4f6f-a993-34effb6df863\") "
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.523527 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-user-template-provider-selection\") pod \"cf538d4f-1acd-4e61-9827-7430d1099138\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") "
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.523549 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-user-idp-0-file-data\") pod \"cf538d4f-1acd-4e61-9827-7430d1099138\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") "
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.523568 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e33bc37-4c54-4f50-95b7-bd2cf2da176e-catalog-content\") pod \"0e33bc37-4c54-4f50-95b7-bd2cf2da176e\" (UID: \"0e33bc37-4c54-4f50-95b7-bd2cf2da176e\") "
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.523583 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b4dd4f4-5ead-4f6f-a993-34effb6df863-catalog-content\") pod \"6b4dd4f4-5ead-4f6f-a993-34effb6df863\" (UID: \"6b4dd4f4-5ead-4f6f-a993-34effb6df863\") "
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.523607 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-user-template-login\") pod \"cf538d4f-1acd-4e61-9827-7430d1099138\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") "
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.523624 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78e0db83-6432-4c4c-bff1-e5134760124e-config\") pod \"78e0db83-6432-4c4c-bff1-e5134760124e\" (UID: \"78e0db83-6432-4c4c-bff1-e5134760124e\") "
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.523640 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-system-service-ca\") pod \"cf538d4f-1acd-4e61-9827-7430d1099138\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") "
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.523693 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-system-ocp-branding-template\") pod \"cf538d4f-1acd-4e61-9827-7430d1099138\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") "
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.523708 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cf538d4f-1acd-4e61-9827-7430d1099138-audit-dir\") pod \"cf538d4f-1acd-4e61-9827-7430d1099138\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") "
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.523723 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e33bc37-4c54-4f50-95b7-bd2cf2da176e-utilities\") pod \"0e33bc37-4c54-4f50-95b7-bd2cf2da176e\" (UID: \"0e33bc37-4c54-4f50-95b7-bd2cf2da176e\") "
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.523743 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kntm8\" (UniqueName: \"kubernetes.io/projected/6b4dd4f4-5ead-4f6f-a993-34effb6df863-kube-api-access-kntm8\") pod \"6b4dd4f4-5ead-4f6f-a993-34effb6df863\" (UID: \"6b4dd4f4-5ead-4f6f-a993-34effb6df863\") "
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.523783 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cf538d4f-1acd-4e61-9827-7430d1099138-audit-policies\") pod \"cf538d4f-1acd-4e61-9827-7430d1099138\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") "
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.523799 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-system-trusted-ca-bundle\") pod \"cf538d4f-1acd-4e61-9827-7430d1099138\" (UID: \"cf538d4f-1acd-4e61-9827-7430d1099138\") "
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.523841 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78e0db83-6432-4c4c-bff1-e5134760124e-client-ca\") pod \"78e0db83-6432-4c4c-bff1-e5134760124e\" (UID: \"78e0db83-6432-4c4c-bff1-e5134760124e\") "
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.523964 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9ec6d401-e9a0-4981-9f41-3c07d8bda018-proxy-ca-bundles\") pod \"controller-manager-7cc59c78dd-gv49x\" (UID: \"9ec6d401-e9a0-4981-9f41-3c07d8bda018\") " pod="openshift-controller-manager/controller-manager-7cc59c78dd-gv49x"
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.523992 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9ec6d401-e9a0-4981-9f41-3c07d8bda018-client-ca\") pod \"controller-manager-7cc59c78dd-gv49x\" (UID: \"9ec6d401-e9a0-4981-9f41-3c07d8bda018\") " pod="openshift-controller-manager/controller-manager-7cc59c78dd-gv49x"
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.524012 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ec6d401-e9a0-4981-9f41-3c07d8bda018-serving-cert\") pod \"controller-manager-7cc59c78dd-gv49x\" (UID: \"9ec6d401-e9a0-4981-9f41-3c07d8bda018\") " pod="openshift-controller-manager/controller-manager-7cc59c78dd-gv49x"
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.524055 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ck99h\" (UniqueName: \"kubernetes.io/projected/9ec6d401-e9a0-4981-9f41-3c07d8bda018-kube-api-access-ck99h\") pod \"controller-manager-7cc59c78dd-gv49x\" (UID: \"9ec6d401-e9a0-4981-9f41-3c07d8bda018\") " pod="openshift-controller-manager/controller-manager-7cc59c78dd-gv49x"
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.524083 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ec6d401-e9a0-4981-9f41-3c07d8bda018-config\") pod \"controller-manager-7cc59c78dd-gv49x\" (UID: \"9ec6d401-e9a0-4981-9f41-3c07d8bda018\") " pod="openshift-controller-manager/controller-manager-7cc59c78dd-gv49x"
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.524969 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9ec6d401-e9a0-4981-9f41-3c07d8bda018-client-ca\") pod \"controller-manager-7cc59c78dd-gv49x\" (UID: \"9ec6d401-e9a0-4981-9f41-3c07d8bda018\") " pod="openshift-controller-manager/controller-manager-7cc59c78dd-gv49x"
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.525038 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "cf538d4f-1acd-4e61-9827-7430d1099138" (UID: "cf538d4f-1acd-4e61-9827-7430d1099138"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.525095 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cf538d4f-1acd-4e61-9827-7430d1099138-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "cf538d4f-1acd-4e61-9827-7430d1099138" (UID: "cf538d4f-1acd-4e61-9827-7430d1099138"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.525416 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ec6d401-e9a0-4981-9f41-3c07d8bda018-config\") pod \"controller-manager-7cc59c78dd-gv49x\" (UID: \"9ec6d401-e9a0-4981-9f41-3c07d8bda018\") " pod="openshift-controller-manager/controller-manager-7cc59c78dd-gv49x"
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.525669 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "cf538d4f-1acd-4e61-9827-7430d1099138" (UID: "cf538d4f-1acd-4e61-9827-7430d1099138"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.526100 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e33bc37-4c54-4f50-95b7-bd2cf2da176e-utilities" (OuterVolumeSpecName: "utilities") pod "0e33bc37-4c54-4f50-95b7-bd2cf2da176e" (UID: "0e33bc37-4c54-4f50-95b7-bd2cf2da176e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.526301 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78e0db83-6432-4c4c-bff1-e5134760124e-client-ca" (OuterVolumeSpecName: "client-ca") pod "78e0db83-6432-4c4c-bff1-e5134760124e" (UID: "78e0db83-6432-4c4c-bff1-e5134760124e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.526344 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78e0db83-6432-4c4c-bff1-e5134760124e-config" (OuterVolumeSpecName: "config") pod "78e0db83-6432-4c4c-bff1-e5134760124e" (UID: "78e0db83-6432-4c4c-bff1-e5134760124e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.526814 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9ec6d401-e9a0-4981-9f41-3c07d8bda018-proxy-ca-bundles\") pod \"controller-manager-7cc59c78dd-gv49x\" (UID: \"9ec6d401-e9a0-4981-9f41-3c07d8bda018\") " pod="openshift-controller-manager/controller-manager-7cc59c78dd-gv49x"
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.526919 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78e0db83-6432-4c4c-bff1-e5134760124e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "78e0db83-6432-4c4c-bff1-e5134760124e" (UID: "78e0db83-6432-4c4c-bff1-e5134760124e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.527310 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf538d4f-1acd-4e61-9827-7430d1099138-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "cf538d4f-1acd-4e61-9827-7430d1099138" (UID: "cf538d4f-1acd-4e61-9827-7430d1099138"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.529385 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "cf538d4f-1acd-4e61-9827-7430d1099138" (UID: "cf538d4f-1acd-4e61-9827-7430d1099138"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.530066 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b4dd4f4-5ead-4f6f-a993-34effb6df863-utilities" (OuterVolumeSpecName: "utilities") pod "6b4dd4f4-5ead-4f6f-a993-34effb6df863" (UID: "6b4dd4f4-5ead-4f6f-a993-34effb6df863"). InnerVolumeSpecName "utilities".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.530508 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ec6d401-e9a0-4981-9f41-3c07d8bda018-serving-cert\") pod \"controller-manager-7cc59c78dd-gv49x\" (UID: \"9ec6d401-e9a0-4981-9f41-3c07d8bda018\") " pod="openshift-controller-manager/controller-manager-7cc59c78dd-gv49x" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.530806 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf538d4f-1acd-4e61-9827-7430d1099138-kube-api-access-8l4rw" (OuterVolumeSpecName: "kube-api-access-8l4rw") pod "cf538d4f-1acd-4e61-9827-7430d1099138" (UID: "cf538d4f-1acd-4e61-9827-7430d1099138"). InnerVolumeSpecName "kube-api-access-8l4rw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.530887 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "cf538d4f-1acd-4e61-9827-7430d1099138" (UID: "cf538d4f-1acd-4e61-9827-7430d1099138"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.531293 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "cf538d4f-1acd-4e61-9827-7430d1099138" (UID: "cf538d4f-1acd-4e61-9827-7430d1099138"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.531386 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "cf538d4f-1acd-4e61-9827-7430d1099138" (UID: "cf538d4f-1acd-4e61-9827-7430d1099138"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.531405 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e33bc37-4c54-4f50-95b7-bd2cf2da176e-kube-api-access-25n9n" (OuterVolumeSpecName: "kube-api-access-25n9n") pod "0e33bc37-4c54-4f50-95b7-bd2cf2da176e" (UID: "0e33bc37-4c54-4f50-95b7-bd2cf2da176e"). InnerVolumeSpecName "kube-api-access-25n9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.531650 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "cf538d4f-1acd-4e61-9827-7430d1099138" (UID: "cf538d4f-1acd-4e61-9827-7430d1099138"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.531932 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "cf538d4f-1acd-4e61-9827-7430d1099138" (UID: "cf538d4f-1acd-4e61-9827-7430d1099138"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.532303 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "cf538d4f-1acd-4e61-9827-7430d1099138" (UID: "cf538d4f-1acd-4e61-9827-7430d1099138"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.532645 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78e0db83-6432-4c4c-bff1-e5134760124e-kube-api-access-d7snw" (OuterVolumeSpecName: "kube-api-access-d7snw") pod "78e0db83-6432-4c4c-bff1-e5134760124e" (UID: "78e0db83-6432-4c4c-bff1-e5134760124e"). InnerVolumeSpecName "kube-api-access-d7snw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.532950 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b4dd4f4-5ead-4f6f-a993-34effb6df863-kube-api-access-kntm8" (OuterVolumeSpecName: "kube-api-access-kntm8") pod "6b4dd4f4-5ead-4f6f-a993-34effb6df863" (UID: "6b4dd4f4-5ead-4f6f-a993-34effb6df863"). InnerVolumeSpecName "kube-api-access-kntm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.533487 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "cf538d4f-1acd-4e61-9827-7430d1099138" (UID: "cf538d4f-1acd-4e61-9827-7430d1099138"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.539423 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "cf538d4f-1acd-4e61-9827-7430d1099138" (UID: "cf538d4f-1acd-4e61-9827-7430d1099138"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.542763 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck99h\" (UniqueName: \"kubernetes.io/projected/9ec6d401-e9a0-4981-9f41-3c07d8bda018-kube-api-access-ck99h\") pod \"controller-manager-7cc59c78dd-gv49x\" (UID: \"9ec6d401-e9a0-4981-9f41-3c07d8bda018\") " pod="openshift-controller-manager/controller-manager-7cc59c78dd-gv49x" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.547381 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e33bc37-4c54-4f50-95b7-bd2cf2da176e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0e33bc37-4c54-4f50-95b7-bd2cf2da176e" (UID: "0e33bc37-4c54-4f50-95b7-bd2cf2da176e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.625675 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e33bc37-4c54-4f50-95b7-bd2cf2da176e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.626142 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.626265 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.626375 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78e0db83-6432-4c4c-bff1-e5134760124e-config\") on node \"crc\" DevicePath \"\"" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.626469 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.626561 4735 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cf538d4f-1acd-4e61-9827-7430d1099138-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.626643 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e33bc37-4c54-4f50-95b7-bd2cf2da176e-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 
00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.626723 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.626804 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kntm8\" (UniqueName: \"kubernetes.io/projected/6b4dd4f4-5ead-4f6f-a993-34effb6df863-kube-api-access-kntm8\") on node \"crc\" DevicePath \"\"" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.626938 4735 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cf538d4f-1acd-4e61-9827-7430d1099138-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.627025 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.627112 4735 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/78e0db83-6432-4c4c-bff1-e5134760124e-client-ca\") on node \"crc\" DevicePath \"\"" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.627210 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/78e0db83-6432-4c4c-bff1-e5134760124e-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.627304 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 23 
00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.627395 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7snw\" (UniqueName: \"kubernetes.io/projected/78e0db83-6432-4c4c-bff1-e5134760124e-kube-api-access-d7snw\") on node \"crc\" DevicePath \"\"" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.627488 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.627573 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8l4rw\" (UniqueName: \"kubernetes.io/projected/cf538d4f-1acd-4e61-9827-7430d1099138-kube-api-access-8l4rw\") on node \"crc\" DevicePath \"\"" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.627676 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.627789 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.627902 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.628020 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25n9n\" (UniqueName: \"kubernetes.io/projected/0e33bc37-4c54-4f50-95b7-bd2cf2da176e-kube-api-access-25n9n\") on node 
\"crc\" DevicePath \"\"" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.628164 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b4dd4f4-5ead-4f6f-a993-34effb6df863-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.628258 4735 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cf538d4f-1acd-4e61-9827-7430d1099138-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.683922 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b4dd4f4-5ead-4f6f-a993-34effb6df863-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6b4dd4f4-5ead-4f6f-a993-34effb6df863" (UID: "6b4dd4f4-5ead-4f6f-a993-34effb6df863"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.684386 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7cc59c78dd-gv49x" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.729589 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b4dd4f4-5ead-4f6f-a993-34effb6df863-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.737096 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-597ccdd7cc-vbgfb"] Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.742280 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-597ccdd7cc-vbgfb"] Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.843174 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f64b75456-8lbqh"] Feb 23 00:10:47 crc kubenswrapper[4735]: E0223 00:10:47.843449 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf538d4f-1acd-4e61-9827-7430d1099138" containerName="oauth-openshift" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.843475 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf538d4f-1acd-4e61-9827-7430d1099138" containerName="oauth-openshift" Feb 23 00:10:47 crc kubenswrapper[4735]: E0223 00:10:47.843493 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b4dd4f4-5ead-4f6f-a993-34effb6df863" containerName="registry-server" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.843505 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b4dd4f4-5ead-4f6f-a993-34effb6df863" containerName="registry-server" Feb 23 00:10:47 crc kubenswrapper[4735]: E0223 00:10:47.843521 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e33bc37-4c54-4f50-95b7-bd2cf2da176e" containerName="extract-content" Feb 23 00:10:47 crc 
kubenswrapper[4735]: I0223 00:10:47.843532 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e33bc37-4c54-4f50-95b7-bd2cf2da176e" containerName="extract-content" Feb 23 00:10:47 crc kubenswrapper[4735]: E0223 00:10:47.843545 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b4dd4f4-5ead-4f6f-a993-34effb6df863" containerName="extract-content" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.843557 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b4dd4f4-5ead-4f6f-a993-34effb6df863" containerName="extract-content" Feb 23 00:10:47 crc kubenswrapper[4735]: E0223 00:10:47.843573 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e33bc37-4c54-4f50-95b7-bd2cf2da176e" containerName="extract-utilities" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.843584 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e33bc37-4c54-4f50-95b7-bd2cf2da176e" containerName="extract-utilities" Feb 23 00:10:47 crc kubenswrapper[4735]: E0223 00:10:47.843600 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b4dd4f4-5ead-4f6f-a993-34effb6df863" containerName="extract-utilities" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.843610 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b4dd4f4-5ead-4f6f-a993-34effb6df863" containerName="extract-utilities" Feb 23 00:10:47 crc kubenswrapper[4735]: E0223 00:10:47.843631 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e33bc37-4c54-4f50-95b7-bd2cf2da176e" containerName="registry-server" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.843641 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e33bc37-4c54-4f50-95b7-bd2cf2da176e" containerName="registry-server" Feb 23 00:10:47 crc kubenswrapper[4735]: E0223 00:10:47.843656 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78e0db83-6432-4c4c-bff1-e5134760124e" containerName="route-controller-manager" Feb 23 00:10:47 
crc kubenswrapper[4735]: I0223 00:10:47.843667 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="78e0db83-6432-4c4c-bff1-e5134760124e" containerName="route-controller-manager" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.843804 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b4dd4f4-5ead-4f6f-a993-34effb6df863" containerName="registry-server" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.843821 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="78e0db83-6432-4c4c-bff1-e5134760124e" containerName="route-controller-manager" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.843835 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e33bc37-4c54-4f50-95b7-bd2cf2da176e" containerName="registry-server" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.843949 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf538d4f-1acd-4e61-9827-7430d1099138" containerName="oauth-openshift" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.844485 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f64b75456-8lbqh" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.847697 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.847786 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.848698 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.848787 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.848956 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.849079 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 23 00:10:47 crc kubenswrapper[4735]: I0223 00:10:47.862131 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f64b75456-8lbqh"] Feb 23 00:10:48 crc kubenswrapper[4735]: I0223 00:10:48.034908 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dfd8095e-5e8e-41a0-9086-a0cd90817076-serving-cert\") pod \"route-controller-manager-f64b75456-8lbqh\" (UID: \"dfd8095e-5e8e-41a0-9086-a0cd90817076\") " pod="openshift-route-controller-manager/route-controller-manager-f64b75456-8lbqh" Feb 23 00:10:48 crc kubenswrapper[4735]: I0223 00:10:48.035103 4735 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfd8095e-5e8e-41a0-9086-a0cd90817076-config\") pod \"route-controller-manager-f64b75456-8lbqh\" (UID: \"dfd8095e-5e8e-41a0-9086-a0cd90817076\") " pod="openshift-route-controller-manager/route-controller-manager-f64b75456-8lbqh" Feb 23 00:10:48 crc kubenswrapper[4735]: I0223 00:10:48.035159 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dfd8095e-5e8e-41a0-9086-a0cd90817076-client-ca\") pod \"route-controller-manager-f64b75456-8lbqh\" (UID: \"dfd8095e-5e8e-41a0-9086-a0cd90817076\") " pod="openshift-route-controller-manager/route-controller-manager-f64b75456-8lbqh" Feb 23 00:10:48 crc kubenswrapper[4735]: I0223 00:10:48.035262 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54bpx\" (UniqueName: \"kubernetes.io/projected/dfd8095e-5e8e-41a0-9086-a0cd90817076-kube-api-access-54bpx\") pod \"route-controller-manager-f64b75456-8lbqh\" (UID: \"dfd8095e-5e8e-41a0-9086-a0cd90817076\") " pod="openshift-route-controller-manager/route-controller-manager-f64b75456-8lbqh" Feb 23 00:10:48 crc kubenswrapper[4735]: I0223 00:10:48.136776 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dfd8095e-5e8e-41a0-9086-a0cd90817076-serving-cert\") pod \"route-controller-manager-f64b75456-8lbqh\" (UID: \"dfd8095e-5e8e-41a0-9086-a0cd90817076\") " pod="openshift-route-controller-manager/route-controller-manager-f64b75456-8lbqh" Feb 23 00:10:48 crc kubenswrapper[4735]: I0223 00:10:48.136917 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfd8095e-5e8e-41a0-9086-a0cd90817076-config\") pod \"route-controller-manager-f64b75456-8lbqh\" (UID: 
\"dfd8095e-5e8e-41a0-9086-a0cd90817076\") " pod="openshift-route-controller-manager/route-controller-manager-f64b75456-8lbqh" Feb 23 00:10:48 crc kubenswrapper[4735]: I0223 00:10:48.136958 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dfd8095e-5e8e-41a0-9086-a0cd90817076-client-ca\") pod \"route-controller-manager-f64b75456-8lbqh\" (UID: \"dfd8095e-5e8e-41a0-9086-a0cd90817076\") " pod="openshift-route-controller-manager/route-controller-manager-f64b75456-8lbqh" Feb 23 00:10:48 crc kubenswrapper[4735]: I0223 00:10:48.137005 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54bpx\" (UniqueName: \"kubernetes.io/projected/dfd8095e-5e8e-41a0-9086-a0cd90817076-kube-api-access-54bpx\") pod \"route-controller-manager-f64b75456-8lbqh\" (UID: \"dfd8095e-5e8e-41a0-9086-a0cd90817076\") " pod="openshift-route-controller-manager/route-controller-manager-f64b75456-8lbqh" Feb 23 00:10:48 crc kubenswrapper[4735]: I0223 00:10:48.139204 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dfd8095e-5e8e-41a0-9086-a0cd90817076-client-ca\") pod \"route-controller-manager-f64b75456-8lbqh\" (UID: \"dfd8095e-5e8e-41a0-9086-a0cd90817076\") " pod="openshift-route-controller-manager/route-controller-manager-f64b75456-8lbqh" Feb 23 00:10:48 crc kubenswrapper[4735]: I0223 00:10:48.139628 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfd8095e-5e8e-41a0-9086-a0cd90817076-config\") pod \"route-controller-manager-f64b75456-8lbqh\" (UID: \"dfd8095e-5e8e-41a0-9086-a0cd90817076\") " pod="openshift-route-controller-manager/route-controller-manager-f64b75456-8lbqh" Feb 23 00:10:48 crc kubenswrapper[4735]: I0223 00:10:48.144002 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/dfd8095e-5e8e-41a0-9086-a0cd90817076-serving-cert\") pod \"route-controller-manager-f64b75456-8lbqh\" (UID: \"dfd8095e-5e8e-41a0-9086-a0cd90817076\") " pod="openshift-route-controller-manager/route-controller-manager-f64b75456-8lbqh" Feb 23 00:10:48 crc kubenswrapper[4735]: I0223 00:10:48.162157 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54bpx\" (UniqueName: \"kubernetes.io/projected/dfd8095e-5e8e-41a0-9086-a0cd90817076-kube-api-access-54bpx\") pod \"route-controller-manager-f64b75456-8lbqh\" (UID: \"dfd8095e-5e8e-41a0-9086-a0cd90817076\") " pod="openshift-route-controller-manager/route-controller-manager-f64b75456-8lbqh" Feb 23 00:10:48 crc kubenswrapper[4735]: I0223 00:10:48.182654 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f64b75456-8lbqh" Feb 23 00:10:48 crc kubenswrapper[4735]: I0223 00:10:48.188808 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7cc59c78dd-gv49x"] Feb 23 00:10:48 crc kubenswrapper[4735]: W0223 00:10:48.205526 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ec6d401_e9a0_4981_9f41_3c07d8bda018.slice/crio-7cd37bf239601031d5d2d70bfbc190f11552b64caf8084543fa352b34b1467c8 WatchSource:0}: Error finding container 7cd37bf239601031d5d2d70bfbc190f11552b64caf8084543fa352b34b1467c8: Status 404 returned error can't find the container with id 7cd37bf239601031d5d2d70bfbc190f11552b64caf8084543fa352b34b1467c8 Feb 23 00:10:48 crc kubenswrapper[4735]: I0223 00:10:48.285110 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="756348da-76db-4458-a280-488a6c79756a" path="/var/lib/kubelet/pods/756348da-76db-4458-a280-488a6c79756a/volumes" Feb 23 00:10:48 crc kubenswrapper[4735]: I0223 00:10:48.285774 4735 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78e0db83-6432-4c4c-bff1-e5134760124e" path="/var/lib/kubelet/pods/78e0db83-6432-4c4c-bff1-e5134760124e/volumes" Feb 23 00:10:48 crc kubenswrapper[4735]: I0223 00:10:48.420463 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5xl8z" Feb 23 00:10:48 crc kubenswrapper[4735]: I0223 00:10:48.420469 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5xl8z" event={"ID":"6b4dd4f4-5ead-4f6f-a993-34effb6df863","Type":"ContainerDied","Data":"a82dd4bb37a675b0056a2017fc7d3f0ce1ceab08fbe58618cd5030097425f57d"} Feb 23 00:10:48 crc kubenswrapper[4735]: I0223 00:10:48.420915 4735 scope.go:117] "RemoveContainer" containerID="b292cf6aa732ca4e252a8296bae3efff4a11e17e9228ed5790004eceb031be7b" Feb 23 00:10:48 crc kubenswrapper[4735]: I0223 00:10:48.449388 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7cc59c78dd-gv49x" event={"ID":"9ec6d401-e9a0-4981-9f41-3c07d8bda018","Type":"ContainerStarted","Data":"7cd37bf239601031d5d2d70bfbc190f11552b64caf8084543fa352b34b1467c8"} Feb 23 00:10:48 crc kubenswrapper[4735]: I0223 00:10:48.449477 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jvnk7" Feb 23 00:10:48 crc kubenswrapper[4735]: I0223 00:10:48.451393 4735 scope.go:117] "RemoveContainer" containerID="bd9f026fad72802ff92fbd116725a4cfd5106959505c59780ea9a34da5e19436" Feb 23 00:10:48 crc kubenswrapper[4735]: I0223 00:10:48.451908 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-74kxk" Feb 23 00:10:48 crc kubenswrapper[4735]: I0223 00:10:48.461284 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5xl8z"] Feb 23 00:10:48 crc kubenswrapper[4735]: I0223 00:10:48.468215 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5xl8z"] Feb 23 00:10:48 crc kubenswrapper[4735]: I0223 00:10:48.478086 4735 scope.go:117] "RemoveContainer" containerID="4bf02ec4c27f762900c43e5d6f61665a6dd764f6f5045271f71b90a55f54e852" Feb 23 00:10:48 crc kubenswrapper[4735]: I0223 00:10:48.479633 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jvnk7"] Feb 23 00:10:48 crc kubenswrapper[4735]: I0223 00:10:48.482920 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jvnk7"] Feb 23 00:10:48 crc kubenswrapper[4735]: I0223 00:10:48.498017 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-74kxk"] Feb 23 00:10:48 crc kubenswrapper[4735]: I0223 00:10:48.502036 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-74kxk"] Feb 23 00:10:48 crc kubenswrapper[4735]: I0223 00:10:48.697726 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f64b75456-8lbqh"] Feb 23 00:10:48 crc kubenswrapper[4735]: W0223 00:10:48.717974 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfd8095e_5e8e_41a0_9086_a0cd90817076.slice/crio-c43ba4b7a8fc2d7ba9d3a2e361a8cd5fb39cd6d176ea8dd113dac17783c4e45e WatchSource:0}: Error finding container c43ba4b7a8fc2d7ba9d3a2e361a8cd5fb39cd6d176ea8dd113dac17783c4e45e: Status 404 returned error can't find the container with id 
c43ba4b7a8fc2d7ba9d3a2e361a8cd5fb39cd6d176ea8dd113dac17783c4e45e Feb 23 00:10:49 crc kubenswrapper[4735]: I0223 00:10:49.457240 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f64b75456-8lbqh" event={"ID":"dfd8095e-5e8e-41a0-9086-a0cd90817076","Type":"ContainerStarted","Data":"16da3c1b6aa98b49ebac0b2f2e09f8d92520a47454bdf9eac9818218ba927afe"} Feb 23 00:10:49 crc kubenswrapper[4735]: I0223 00:10:49.457298 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f64b75456-8lbqh" event={"ID":"dfd8095e-5e8e-41a0-9086-a0cd90817076","Type":"ContainerStarted","Data":"c43ba4b7a8fc2d7ba9d3a2e361a8cd5fb39cd6d176ea8dd113dac17783c4e45e"} Feb 23 00:10:49 crc kubenswrapper[4735]: I0223 00:10:49.457596 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-f64b75456-8lbqh" Feb 23 00:10:49 crc kubenswrapper[4735]: I0223 00:10:49.467791 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7cc59c78dd-gv49x" event={"ID":"9ec6d401-e9a0-4981-9f41-3c07d8bda018","Type":"ContainerStarted","Data":"a2badf04c1e027a2d1c952841116544a8b1d1a439814613753000bc3aff0d1c5"} Feb 23 00:10:49 crc kubenswrapper[4735]: I0223 00:10:49.468070 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7cc59c78dd-gv49x" Feb 23 00:10:49 crc kubenswrapper[4735]: I0223 00:10:49.469164 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-f64b75456-8lbqh" Feb 23 00:10:49 crc kubenswrapper[4735]: I0223 00:10:49.481080 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7cc59c78dd-gv49x" Feb 23 00:10:49 crc kubenswrapper[4735]: 
I0223 00:10:49.482991 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-f64b75456-8lbqh" podStartSLOduration=3.482979803 podStartE2EDuration="3.482979803s" podCreationTimestamp="2026-02-23 00:10:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:10:49.480101315 +0000 UTC m=+207.943647286" watchObservedRunningTime="2026-02-23 00:10:49.482979803 +0000 UTC m=+207.946525774" Feb 23 00:10:49 crc kubenswrapper[4735]: I0223 00:10:49.543841 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7cc59c78dd-gv49x" podStartSLOduration=3.543823606 podStartE2EDuration="3.543823606s" podCreationTimestamp="2026-02-23 00:10:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:10:49.538173762 +0000 UTC m=+208.001719733" watchObservedRunningTime="2026-02-23 00:10:49.543823606 +0000 UTC m=+208.007369577" Feb 23 00:10:50 crc kubenswrapper[4735]: I0223 00:10:50.282574 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e33bc37-4c54-4f50-95b7-bd2cf2da176e" path="/var/lib/kubelet/pods/0e33bc37-4c54-4f50-95b7-bd2cf2da176e/volumes" Feb 23 00:10:50 crc kubenswrapper[4735]: I0223 00:10:50.283981 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b4dd4f4-5ead-4f6f-a993-34effb6df863" path="/var/lib/kubelet/pods/6b4dd4f4-5ead-4f6f-a993-34effb6df863/volumes" Feb 23 00:10:50 crc kubenswrapper[4735]: I0223 00:10:50.286294 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf538d4f-1acd-4e61-9827-7430d1099138" path="/var/lib/kubelet/pods/cf538d4f-1acd-4e61-9827-7430d1099138/volumes" Feb 23 00:10:51 crc kubenswrapper[4735]: I0223 00:10:51.845565 4735 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-58cd8c9949-bhgqw"] Feb 23 00:10:51 crc kubenswrapper[4735]: I0223 00:10:51.847198 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" Feb 23 00:10:51 crc kubenswrapper[4735]: I0223 00:10:51.851472 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 23 00:10:51 crc kubenswrapper[4735]: I0223 00:10:51.852149 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 23 00:10:51 crc kubenswrapper[4735]: I0223 00:10:51.852490 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 23 00:10:51 crc kubenswrapper[4735]: I0223 00:10:51.857010 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 23 00:10:51 crc kubenswrapper[4735]: I0223 00:10:51.857327 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 23 00:10:51 crc kubenswrapper[4735]: I0223 00:10:51.857582 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 23 00:10:51 crc kubenswrapper[4735]: I0223 00:10:51.862639 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 23 00:10:51 crc kubenswrapper[4735]: I0223 00:10:51.862665 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 23 00:10:51 crc kubenswrapper[4735]: I0223 00:10:51.862926 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" 
Feb 23 00:10:51 crc kubenswrapper[4735]: I0223 00:10:51.863252 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 23 00:10:51 crc kubenswrapper[4735]: I0223 00:10:51.863239 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 23 00:10:51 crc kubenswrapper[4735]: I0223 00:10:51.863313 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 23 00:10:51 crc kubenswrapper[4735]: I0223 00:10:51.889979 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 23 00:10:51 crc kubenswrapper[4735]: I0223 00:10:51.890040 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 23 00:10:51 crc kubenswrapper[4735]: I0223 00:10:51.897256 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 23 00:10:51 crc kubenswrapper[4735]: I0223 00:10:51.897873 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-58cd8c9949-bhgqw"] Feb 23 00:10:51 crc kubenswrapper[4735]: I0223 00:10:51.997585 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/628644e2-6af6-42c7-ae37-a7263e67e7b6-audit-dir\") pod \"oauth-openshift-58cd8c9949-bhgqw\" (UID: \"628644e2-6af6-42c7-ae37-a7263e67e7b6\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" Feb 23 00:10:51 crc kubenswrapper[4735]: I0223 00:10:51.997932 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/628644e2-6af6-42c7-ae37-a7263e67e7b6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-58cd8c9949-bhgqw\" (UID: \"628644e2-6af6-42c7-ae37-a7263e67e7b6\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" Feb 23 00:10:51 crc kubenswrapper[4735]: I0223 00:10:51.998047 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/628644e2-6af6-42c7-ae37-a7263e67e7b6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-58cd8c9949-bhgqw\" (UID: \"628644e2-6af6-42c7-ae37-a7263e67e7b6\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" Feb 23 00:10:51 crc kubenswrapper[4735]: I0223 00:10:51.998210 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/628644e2-6af6-42c7-ae37-a7263e67e7b6-v4-0-config-user-template-login\") pod \"oauth-openshift-58cd8c9949-bhgqw\" (UID: \"628644e2-6af6-42c7-ae37-a7263e67e7b6\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" Feb 23 00:10:51 crc kubenswrapper[4735]: I0223 00:10:51.998326 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/628644e2-6af6-42c7-ae37-a7263e67e7b6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-58cd8c9949-bhgqw\" (UID: \"628644e2-6af6-42c7-ae37-a7263e67e7b6\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" Feb 23 00:10:51 crc kubenswrapper[4735]: I0223 00:10:51.998438 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/628644e2-6af6-42c7-ae37-a7263e67e7b6-v4-0-config-system-service-ca\") pod \"oauth-openshift-58cd8c9949-bhgqw\" (UID: 
\"628644e2-6af6-42c7-ae37-a7263e67e7b6\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" Feb 23 00:10:51 crc kubenswrapper[4735]: I0223 00:10:51.998555 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/628644e2-6af6-42c7-ae37-a7263e67e7b6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-58cd8c9949-bhgqw\" (UID: \"628644e2-6af6-42c7-ae37-a7263e67e7b6\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" Feb 23 00:10:51 crc kubenswrapper[4735]: I0223 00:10:51.998687 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsj2p\" (UniqueName: \"kubernetes.io/projected/628644e2-6af6-42c7-ae37-a7263e67e7b6-kube-api-access-hsj2p\") pod \"oauth-openshift-58cd8c9949-bhgqw\" (UID: \"628644e2-6af6-42c7-ae37-a7263e67e7b6\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" Feb 23 00:10:51 crc kubenswrapper[4735]: I0223 00:10:51.998821 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/628644e2-6af6-42c7-ae37-a7263e67e7b6-v4-0-config-system-session\") pod \"oauth-openshift-58cd8c9949-bhgqw\" (UID: \"628644e2-6af6-42c7-ae37-a7263e67e7b6\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" Feb 23 00:10:51 crc kubenswrapper[4735]: I0223 00:10:51.998943 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/628644e2-6af6-42c7-ae37-a7263e67e7b6-v4-0-config-user-template-error\") pod \"oauth-openshift-58cd8c9949-bhgqw\" (UID: \"628644e2-6af6-42c7-ae37-a7263e67e7b6\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" Feb 23 00:10:51 crc kubenswrapper[4735]: I0223 00:10:51.999039 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/628644e2-6af6-42c7-ae37-a7263e67e7b6-audit-policies\") pod \"oauth-openshift-58cd8c9949-bhgqw\" (UID: \"628644e2-6af6-42c7-ae37-a7263e67e7b6\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" Feb 23 00:10:51 crc kubenswrapper[4735]: I0223 00:10:51.999176 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/628644e2-6af6-42c7-ae37-a7263e67e7b6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-58cd8c9949-bhgqw\" (UID: \"628644e2-6af6-42c7-ae37-a7263e67e7b6\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" Feb 23 00:10:51 crc kubenswrapper[4735]: I0223 00:10:51.999308 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/628644e2-6af6-42c7-ae37-a7263e67e7b6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-58cd8c9949-bhgqw\" (UID: \"628644e2-6af6-42c7-ae37-a7263e67e7b6\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" Feb 23 00:10:51 crc kubenswrapper[4735]: I0223 00:10:51.999422 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/628644e2-6af6-42c7-ae37-a7263e67e7b6-v4-0-config-system-router-certs\") pod \"oauth-openshift-58cd8c9949-bhgqw\" (UID: \"628644e2-6af6-42c7-ae37-a7263e67e7b6\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" Feb 23 00:10:52 crc kubenswrapper[4735]: I0223 00:10:52.100234 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsj2p\" (UniqueName: 
\"kubernetes.io/projected/628644e2-6af6-42c7-ae37-a7263e67e7b6-kube-api-access-hsj2p\") pod \"oauth-openshift-58cd8c9949-bhgqw\" (UID: \"628644e2-6af6-42c7-ae37-a7263e67e7b6\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" Feb 23 00:10:52 crc kubenswrapper[4735]: I0223 00:10:52.100272 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/628644e2-6af6-42c7-ae37-a7263e67e7b6-v4-0-config-system-session\") pod \"oauth-openshift-58cd8c9949-bhgqw\" (UID: \"628644e2-6af6-42c7-ae37-a7263e67e7b6\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" Feb 23 00:10:52 crc kubenswrapper[4735]: I0223 00:10:52.100293 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/628644e2-6af6-42c7-ae37-a7263e67e7b6-v4-0-config-user-template-error\") pod \"oauth-openshift-58cd8c9949-bhgqw\" (UID: \"628644e2-6af6-42c7-ae37-a7263e67e7b6\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" Feb 23 00:10:52 crc kubenswrapper[4735]: I0223 00:10:52.100313 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/628644e2-6af6-42c7-ae37-a7263e67e7b6-audit-policies\") pod \"oauth-openshift-58cd8c9949-bhgqw\" (UID: \"628644e2-6af6-42c7-ae37-a7263e67e7b6\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" Feb 23 00:10:52 crc kubenswrapper[4735]: I0223 00:10:52.100343 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/628644e2-6af6-42c7-ae37-a7263e67e7b6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-58cd8c9949-bhgqw\" (UID: \"628644e2-6af6-42c7-ae37-a7263e67e7b6\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" Feb 23 
00:10:52 crc kubenswrapper[4735]: I0223 00:10:52.100362 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/628644e2-6af6-42c7-ae37-a7263e67e7b6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-58cd8c9949-bhgqw\" (UID: \"628644e2-6af6-42c7-ae37-a7263e67e7b6\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" Feb 23 00:10:52 crc kubenswrapper[4735]: I0223 00:10:52.100382 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/628644e2-6af6-42c7-ae37-a7263e67e7b6-v4-0-config-system-router-certs\") pod \"oauth-openshift-58cd8c9949-bhgqw\" (UID: \"628644e2-6af6-42c7-ae37-a7263e67e7b6\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" Feb 23 00:10:52 crc kubenswrapper[4735]: I0223 00:10:52.100401 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/628644e2-6af6-42c7-ae37-a7263e67e7b6-audit-dir\") pod \"oauth-openshift-58cd8c9949-bhgqw\" (UID: \"628644e2-6af6-42c7-ae37-a7263e67e7b6\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" Feb 23 00:10:52 crc kubenswrapper[4735]: I0223 00:10:52.100431 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/628644e2-6af6-42c7-ae37-a7263e67e7b6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-58cd8c9949-bhgqw\" (UID: \"628644e2-6af6-42c7-ae37-a7263e67e7b6\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" Feb 23 00:10:52 crc kubenswrapper[4735]: I0223 00:10:52.100463 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/628644e2-6af6-42c7-ae37-a7263e67e7b6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-58cd8c9949-bhgqw\" (UID: \"628644e2-6af6-42c7-ae37-a7263e67e7b6\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" Feb 23 00:10:52 crc kubenswrapper[4735]: I0223 00:10:52.100483 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/628644e2-6af6-42c7-ae37-a7263e67e7b6-v4-0-config-user-template-login\") pod \"oauth-openshift-58cd8c9949-bhgqw\" (UID: \"628644e2-6af6-42c7-ae37-a7263e67e7b6\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" Feb 23 00:10:52 crc kubenswrapper[4735]: I0223 00:10:52.100500 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/628644e2-6af6-42c7-ae37-a7263e67e7b6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-58cd8c9949-bhgqw\" (UID: \"628644e2-6af6-42c7-ae37-a7263e67e7b6\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" Feb 23 00:10:52 crc kubenswrapper[4735]: I0223 00:10:52.100521 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/628644e2-6af6-42c7-ae37-a7263e67e7b6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-58cd8c9949-bhgqw\" (UID: \"628644e2-6af6-42c7-ae37-a7263e67e7b6\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" Feb 23 00:10:52 crc kubenswrapper[4735]: I0223 00:10:52.100552 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/628644e2-6af6-42c7-ae37-a7263e67e7b6-v4-0-config-system-service-ca\") pod \"oauth-openshift-58cd8c9949-bhgqw\" (UID: \"628644e2-6af6-42c7-ae37-a7263e67e7b6\") " 
pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" Feb 23 00:10:52 crc kubenswrapper[4735]: I0223 00:10:52.101240 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/628644e2-6af6-42c7-ae37-a7263e67e7b6-v4-0-config-system-service-ca\") pod \"oauth-openshift-58cd8c9949-bhgqw\" (UID: \"628644e2-6af6-42c7-ae37-a7263e67e7b6\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" Feb 23 00:10:52 crc kubenswrapper[4735]: I0223 00:10:52.101286 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/628644e2-6af6-42c7-ae37-a7263e67e7b6-audit-dir\") pod \"oauth-openshift-58cd8c9949-bhgqw\" (UID: \"628644e2-6af6-42c7-ae37-a7263e67e7b6\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" Feb 23 00:10:52 crc kubenswrapper[4735]: I0223 00:10:52.104088 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/628644e2-6af6-42c7-ae37-a7263e67e7b6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-58cd8c9949-bhgqw\" (UID: \"628644e2-6af6-42c7-ae37-a7263e67e7b6\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" Feb 23 00:10:52 crc kubenswrapper[4735]: I0223 00:10:52.104172 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/628644e2-6af6-42c7-ae37-a7263e67e7b6-audit-policies\") pod \"oauth-openshift-58cd8c9949-bhgqw\" (UID: \"628644e2-6af6-42c7-ae37-a7263e67e7b6\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" Feb 23 00:10:52 crc kubenswrapper[4735]: I0223 00:10:52.104595 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/628644e2-6af6-42c7-ae37-a7263e67e7b6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-58cd8c9949-bhgqw\" (UID: \"628644e2-6af6-42c7-ae37-a7263e67e7b6\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" Feb 23 00:10:52 crc kubenswrapper[4735]: I0223 00:10:52.106692 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/628644e2-6af6-42c7-ae37-a7263e67e7b6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-58cd8c9949-bhgqw\" (UID: \"628644e2-6af6-42c7-ae37-a7263e67e7b6\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" Feb 23 00:10:52 crc kubenswrapper[4735]: I0223 00:10:52.106837 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/628644e2-6af6-42c7-ae37-a7263e67e7b6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-58cd8c9949-bhgqw\" (UID: \"628644e2-6af6-42c7-ae37-a7263e67e7b6\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" Feb 23 00:10:52 crc kubenswrapper[4735]: I0223 00:10:52.114013 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/628644e2-6af6-42c7-ae37-a7263e67e7b6-v4-0-config-system-router-certs\") pod \"oauth-openshift-58cd8c9949-bhgqw\" (UID: \"628644e2-6af6-42c7-ae37-a7263e67e7b6\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" Feb 23 00:10:52 crc kubenswrapper[4735]: I0223 00:10:52.114916 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/628644e2-6af6-42c7-ae37-a7263e67e7b6-v4-0-config-user-template-error\") pod \"oauth-openshift-58cd8c9949-bhgqw\" (UID: \"628644e2-6af6-42c7-ae37-a7263e67e7b6\") " 
pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" Feb 23 00:10:52 crc kubenswrapper[4735]: I0223 00:10:52.115005 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/628644e2-6af6-42c7-ae37-a7263e67e7b6-v4-0-config-user-template-login\") pod \"oauth-openshift-58cd8c9949-bhgqw\" (UID: \"628644e2-6af6-42c7-ae37-a7263e67e7b6\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" Feb 23 00:10:52 crc kubenswrapper[4735]: I0223 00:10:52.117774 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/628644e2-6af6-42c7-ae37-a7263e67e7b6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-58cd8c9949-bhgqw\" (UID: \"628644e2-6af6-42c7-ae37-a7263e67e7b6\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" Feb 23 00:10:52 crc kubenswrapper[4735]: I0223 00:10:52.120106 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/628644e2-6af6-42c7-ae37-a7263e67e7b6-v4-0-config-system-session\") pod \"oauth-openshift-58cd8c9949-bhgqw\" (UID: \"628644e2-6af6-42c7-ae37-a7263e67e7b6\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" Feb 23 00:10:52 crc kubenswrapper[4735]: I0223 00:10:52.123033 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/628644e2-6af6-42c7-ae37-a7263e67e7b6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-58cd8c9949-bhgqw\" (UID: \"628644e2-6af6-42c7-ae37-a7263e67e7b6\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" Feb 23 00:10:52 crc kubenswrapper[4735]: I0223 00:10:52.126766 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsj2p\" (UniqueName: 
\"kubernetes.io/projected/628644e2-6af6-42c7-ae37-a7263e67e7b6-kube-api-access-hsj2p\") pod \"oauth-openshift-58cd8c9949-bhgqw\" (UID: \"628644e2-6af6-42c7-ae37-a7263e67e7b6\") " pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" Feb 23 00:10:52 crc kubenswrapper[4735]: I0223 00:10:52.189083 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" Feb 23 00:10:52 crc kubenswrapper[4735]: I0223 00:10:52.664299 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-58cd8c9949-bhgqw"] Feb 23 00:10:52 crc kubenswrapper[4735]: W0223 00:10:52.673410 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod628644e2_6af6_42c7_ae37_a7263e67e7b6.slice/crio-cbef89b42477ec48e7ab0cd761f0ddf653479c36f58d98183fcb2a0740be8051 WatchSource:0}: Error finding container cbef89b42477ec48e7ab0cd761f0ddf653479c36f58d98183fcb2a0740be8051: Status 404 returned error can't find the container with id cbef89b42477ec48e7ab0cd761f0ddf653479c36f58d98183fcb2a0740be8051 Feb 23 00:10:53 crc kubenswrapper[4735]: I0223 00:10:53.499892 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" event={"ID":"628644e2-6af6-42c7-ae37-a7263e67e7b6","Type":"ContainerStarted","Data":"d9180c5307064031d2694b9d4be6231c6d694c23ccf527b92bc3c7049ac267d8"} Feb 23 00:10:53 crc kubenswrapper[4735]: I0223 00:10:53.500521 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" event={"ID":"628644e2-6af6-42c7-ae37-a7263e67e7b6","Type":"ContainerStarted","Data":"cbef89b42477ec48e7ab0cd761f0ddf653479c36f58d98183fcb2a0740be8051"} Feb 23 00:10:53 crc kubenswrapper[4735]: I0223 00:10:53.500569 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" Feb 23 00:10:53 crc kubenswrapper[4735]: I0223 00:10:53.521988 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" Feb 23 00:10:53 crc kubenswrapper[4735]: I0223 00:10:53.539334 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-58cd8c9949-bhgqw" podStartSLOduration=32.539298912 podStartE2EDuration="32.539298912s" podCreationTimestamp="2026-02-23 00:10:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:10:53.531647243 +0000 UTC m=+211.995193224" watchObservedRunningTime="2026-02-23 00:10:53.539298912 +0000 UTC m=+212.002844933" Feb 23 00:11:06 crc kubenswrapper[4735]: I0223 00:11:06.133113 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7cc59c78dd-gv49x"] Feb 23 00:11:06 crc kubenswrapper[4735]: I0223 00:11:06.133802 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7cc59c78dd-gv49x" podUID="9ec6d401-e9a0-4981-9f41-3c07d8bda018" containerName="controller-manager" containerID="cri-o://a2badf04c1e027a2d1c952841116544a8b1d1a439814613753000bc3aff0d1c5" gracePeriod=30 Feb 23 00:11:06 crc kubenswrapper[4735]: I0223 00:11:06.579473 4735 generic.go:334] "Generic (PLEG): container finished" podID="9ec6d401-e9a0-4981-9f41-3c07d8bda018" containerID="a2badf04c1e027a2d1c952841116544a8b1d1a439814613753000bc3aff0d1c5" exitCode=0 Feb 23 00:11:06 crc kubenswrapper[4735]: I0223 00:11:06.579572 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7cc59c78dd-gv49x" 
event={"ID":"9ec6d401-e9a0-4981-9f41-3c07d8bda018","Type":"ContainerDied","Data":"a2badf04c1e027a2d1c952841116544a8b1d1a439814613753000bc3aff0d1c5"} Feb 23 00:11:06 crc kubenswrapper[4735]: I0223 00:11:06.723086 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7cc59c78dd-gv49x" Feb 23 00:11:06 crc kubenswrapper[4735]: I0223 00:11:06.919665 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ec6d401-e9a0-4981-9f41-3c07d8bda018-config\") pod \"9ec6d401-e9a0-4981-9f41-3c07d8bda018\" (UID: \"9ec6d401-e9a0-4981-9f41-3c07d8bda018\") " Feb 23 00:11:06 crc kubenswrapper[4735]: I0223 00:11:06.919776 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ck99h\" (UniqueName: \"kubernetes.io/projected/9ec6d401-e9a0-4981-9f41-3c07d8bda018-kube-api-access-ck99h\") pod \"9ec6d401-e9a0-4981-9f41-3c07d8bda018\" (UID: \"9ec6d401-e9a0-4981-9f41-3c07d8bda018\") " Feb 23 00:11:06 crc kubenswrapper[4735]: I0223 00:11:06.919838 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9ec6d401-e9a0-4981-9f41-3c07d8bda018-client-ca\") pod \"9ec6d401-e9a0-4981-9f41-3c07d8bda018\" (UID: \"9ec6d401-e9a0-4981-9f41-3c07d8bda018\") " Feb 23 00:11:06 crc kubenswrapper[4735]: I0223 00:11:06.919917 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ec6d401-e9a0-4981-9f41-3c07d8bda018-serving-cert\") pod \"9ec6d401-e9a0-4981-9f41-3c07d8bda018\" (UID: \"9ec6d401-e9a0-4981-9f41-3c07d8bda018\") " Feb 23 00:11:06 crc kubenswrapper[4735]: I0223 00:11:06.919959 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/9ec6d401-e9a0-4981-9f41-3c07d8bda018-proxy-ca-bundles\") pod \"9ec6d401-e9a0-4981-9f41-3c07d8bda018\" (UID: \"9ec6d401-e9a0-4981-9f41-3c07d8bda018\") " Feb 23 00:11:06 crc kubenswrapper[4735]: I0223 00:11:06.920663 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ec6d401-e9a0-4981-9f41-3c07d8bda018-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "9ec6d401-e9a0-4981-9f41-3c07d8bda018" (UID: "9ec6d401-e9a0-4981-9f41-3c07d8bda018"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:11:06 crc kubenswrapper[4735]: I0223 00:11:06.920712 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ec6d401-e9a0-4981-9f41-3c07d8bda018-client-ca" (OuterVolumeSpecName: "client-ca") pod "9ec6d401-e9a0-4981-9f41-3c07d8bda018" (UID: "9ec6d401-e9a0-4981-9f41-3c07d8bda018"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:11:06 crc kubenswrapper[4735]: I0223 00:11:06.921022 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ec6d401-e9a0-4981-9f41-3c07d8bda018-config" (OuterVolumeSpecName: "config") pod "9ec6d401-e9a0-4981-9f41-3c07d8bda018" (UID: "9ec6d401-e9a0-4981-9f41-3c07d8bda018"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:11:06 crc kubenswrapper[4735]: I0223 00:11:06.924971 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ec6d401-e9a0-4981-9f41-3c07d8bda018-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9ec6d401-e9a0-4981-9f41-3c07d8bda018" (UID: "9ec6d401-e9a0-4981-9f41-3c07d8bda018"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:11:06 crc kubenswrapper[4735]: I0223 00:11:06.925394 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ec6d401-e9a0-4981-9f41-3c07d8bda018-kube-api-access-ck99h" (OuterVolumeSpecName: "kube-api-access-ck99h") pod "9ec6d401-e9a0-4981-9f41-3c07d8bda018" (UID: "9ec6d401-e9a0-4981-9f41-3c07d8bda018"). InnerVolumeSpecName "kube-api-access-ck99h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:11:07 crc kubenswrapper[4735]: I0223 00:11:07.021093 4735 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ec6d401-e9a0-4981-9f41-3c07d8bda018-config\") on node \"crc\" DevicePath \"\"" Feb 23 00:11:07 crc kubenswrapper[4735]: I0223 00:11:07.021140 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ck99h\" (UniqueName: \"kubernetes.io/projected/9ec6d401-e9a0-4981-9f41-3c07d8bda018-kube-api-access-ck99h\") on node \"crc\" DevicePath \"\"" Feb 23 00:11:07 crc kubenswrapper[4735]: I0223 00:11:07.021154 4735 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9ec6d401-e9a0-4981-9f41-3c07d8bda018-client-ca\") on node \"crc\" DevicePath \"\"" Feb 23 00:11:07 crc kubenswrapper[4735]: I0223 00:11:07.021166 4735 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ec6d401-e9a0-4981-9f41-3c07d8bda018-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 23 00:11:07 crc kubenswrapper[4735]: I0223 00:11:07.021178 4735 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9ec6d401-e9a0-4981-9f41-3c07d8bda018-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 23 00:11:07 crc kubenswrapper[4735]: I0223 00:11:07.589703 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-7cc59c78dd-gv49x" event={"ID":"9ec6d401-e9a0-4981-9f41-3c07d8bda018","Type":"ContainerDied","Data":"7cd37bf239601031d5d2d70bfbc190f11552b64caf8084543fa352b34b1467c8"} Feb 23 00:11:07 crc kubenswrapper[4735]: I0223 00:11:07.590104 4735 scope.go:117] "RemoveContainer" containerID="a2badf04c1e027a2d1c952841116544a8b1d1a439814613753000bc3aff0d1c5" Feb 23 00:11:07 crc kubenswrapper[4735]: I0223 00:11:07.589822 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7cc59c78dd-gv49x" Feb 23 00:11:07 crc kubenswrapper[4735]: I0223 00:11:07.640975 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7cc59c78dd-gv49x"] Feb 23 00:11:07 crc kubenswrapper[4735]: I0223 00:11:07.645028 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7cc59c78dd-gv49x"] Feb 23 00:11:07 crc kubenswrapper[4735]: I0223 00:11:07.849157 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-fbd5d9bb6-cv9sc"] Feb 23 00:11:07 crc kubenswrapper[4735]: E0223 00:11:07.849342 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ec6d401-e9a0-4981-9f41-3c07d8bda018" containerName="controller-manager" Feb 23 00:11:07 crc kubenswrapper[4735]: I0223 00:11:07.849354 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ec6d401-e9a0-4981-9f41-3c07d8bda018" containerName="controller-manager" Feb 23 00:11:07 crc kubenswrapper[4735]: I0223 00:11:07.849457 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ec6d401-e9a0-4981-9f41-3c07d8bda018" containerName="controller-manager" Feb 23 00:11:07 crc kubenswrapper[4735]: I0223 00:11:07.849814 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-fbd5d9bb6-cv9sc" Feb 23 00:11:07 crc kubenswrapper[4735]: I0223 00:11:07.851863 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 23 00:11:07 crc kubenswrapper[4735]: I0223 00:11:07.852070 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 23 00:11:07 crc kubenswrapper[4735]: I0223 00:11:07.852338 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 23 00:11:07 crc kubenswrapper[4735]: I0223 00:11:07.852688 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 23 00:11:07 crc kubenswrapper[4735]: I0223 00:11:07.852842 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 23 00:11:07 crc kubenswrapper[4735]: I0223 00:11:07.855433 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 23 00:11:07 crc kubenswrapper[4735]: I0223 00:11:07.859167 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 23 00:11:07 crc kubenswrapper[4735]: I0223 00:11:07.865893 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-fbd5d9bb6-cv9sc"] Feb 23 00:11:08 crc kubenswrapper[4735]: I0223 00:11:08.031793 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9jws\" (UniqueName: \"kubernetes.io/projected/c2aaebe4-01e2-4c92-b302-1967751f004d-kube-api-access-l9jws\") pod \"controller-manager-fbd5d9bb6-cv9sc\" (UID: \"c2aaebe4-01e2-4c92-b302-1967751f004d\") " 
pod="openshift-controller-manager/controller-manager-fbd5d9bb6-cv9sc" Feb 23 00:11:08 crc kubenswrapper[4735]: I0223 00:11:08.031877 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2aaebe4-01e2-4c92-b302-1967751f004d-serving-cert\") pod \"controller-manager-fbd5d9bb6-cv9sc\" (UID: \"c2aaebe4-01e2-4c92-b302-1967751f004d\") " pod="openshift-controller-manager/controller-manager-fbd5d9bb6-cv9sc" Feb 23 00:11:08 crc kubenswrapper[4735]: I0223 00:11:08.031929 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2aaebe4-01e2-4c92-b302-1967751f004d-client-ca\") pod \"controller-manager-fbd5d9bb6-cv9sc\" (UID: \"c2aaebe4-01e2-4c92-b302-1967751f004d\") " pod="openshift-controller-manager/controller-manager-fbd5d9bb6-cv9sc" Feb 23 00:11:08 crc kubenswrapper[4735]: I0223 00:11:08.031976 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2aaebe4-01e2-4c92-b302-1967751f004d-config\") pod \"controller-manager-fbd5d9bb6-cv9sc\" (UID: \"c2aaebe4-01e2-4c92-b302-1967751f004d\") " pod="openshift-controller-manager/controller-manager-fbd5d9bb6-cv9sc" Feb 23 00:11:08 crc kubenswrapper[4735]: I0223 00:11:08.031997 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c2aaebe4-01e2-4c92-b302-1967751f004d-proxy-ca-bundles\") pod \"controller-manager-fbd5d9bb6-cv9sc\" (UID: \"c2aaebe4-01e2-4c92-b302-1967751f004d\") " pod="openshift-controller-manager/controller-manager-fbd5d9bb6-cv9sc" Feb 23 00:11:08 crc kubenswrapper[4735]: I0223 00:11:08.133073 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9jws\" (UniqueName: 
\"kubernetes.io/projected/c2aaebe4-01e2-4c92-b302-1967751f004d-kube-api-access-l9jws\") pod \"controller-manager-fbd5d9bb6-cv9sc\" (UID: \"c2aaebe4-01e2-4c92-b302-1967751f004d\") " pod="openshift-controller-manager/controller-manager-fbd5d9bb6-cv9sc" Feb 23 00:11:08 crc kubenswrapper[4735]: I0223 00:11:08.133129 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2aaebe4-01e2-4c92-b302-1967751f004d-serving-cert\") pod \"controller-manager-fbd5d9bb6-cv9sc\" (UID: \"c2aaebe4-01e2-4c92-b302-1967751f004d\") " pod="openshift-controller-manager/controller-manager-fbd5d9bb6-cv9sc" Feb 23 00:11:08 crc kubenswrapper[4735]: I0223 00:11:08.133184 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2aaebe4-01e2-4c92-b302-1967751f004d-client-ca\") pod \"controller-manager-fbd5d9bb6-cv9sc\" (UID: \"c2aaebe4-01e2-4c92-b302-1967751f004d\") " pod="openshift-controller-manager/controller-manager-fbd5d9bb6-cv9sc" Feb 23 00:11:08 crc kubenswrapper[4735]: I0223 00:11:08.133233 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2aaebe4-01e2-4c92-b302-1967751f004d-config\") pod \"controller-manager-fbd5d9bb6-cv9sc\" (UID: \"c2aaebe4-01e2-4c92-b302-1967751f004d\") " pod="openshift-controller-manager/controller-manager-fbd5d9bb6-cv9sc" Feb 23 00:11:08 crc kubenswrapper[4735]: I0223 00:11:08.133255 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c2aaebe4-01e2-4c92-b302-1967751f004d-proxy-ca-bundles\") pod \"controller-manager-fbd5d9bb6-cv9sc\" (UID: \"c2aaebe4-01e2-4c92-b302-1967751f004d\") " pod="openshift-controller-manager/controller-manager-fbd5d9bb6-cv9sc" Feb 23 00:11:08 crc kubenswrapper[4735]: I0223 00:11:08.134601 4735 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2aaebe4-01e2-4c92-b302-1967751f004d-client-ca\") pod \"controller-manager-fbd5d9bb6-cv9sc\" (UID: \"c2aaebe4-01e2-4c92-b302-1967751f004d\") " pod="openshift-controller-manager/controller-manager-fbd5d9bb6-cv9sc" Feb 23 00:11:08 crc kubenswrapper[4735]: I0223 00:11:08.135113 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2aaebe4-01e2-4c92-b302-1967751f004d-config\") pod \"controller-manager-fbd5d9bb6-cv9sc\" (UID: \"c2aaebe4-01e2-4c92-b302-1967751f004d\") " pod="openshift-controller-manager/controller-manager-fbd5d9bb6-cv9sc" Feb 23 00:11:08 crc kubenswrapper[4735]: I0223 00:11:08.136072 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c2aaebe4-01e2-4c92-b302-1967751f004d-proxy-ca-bundles\") pod \"controller-manager-fbd5d9bb6-cv9sc\" (UID: \"c2aaebe4-01e2-4c92-b302-1967751f004d\") " pod="openshift-controller-manager/controller-manager-fbd5d9bb6-cv9sc" Feb 23 00:11:08 crc kubenswrapper[4735]: I0223 00:11:08.142365 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2aaebe4-01e2-4c92-b302-1967751f004d-serving-cert\") pod \"controller-manager-fbd5d9bb6-cv9sc\" (UID: \"c2aaebe4-01e2-4c92-b302-1967751f004d\") " pod="openshift-controller-manager/controller-manager-fbd5d9bb6-cv9sc" Feb 23 00:11:08 crc kubenswrapper[4735]: I0223 00:11:08.164163 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9jws\" (UniqueName: \"kubernetes.io/projected/c2aaebe4-01e2-4c92-b302-1967751f004d-kube-api-access-l9jws\") pod \"controller-manager-fbd5d9bb6-cv9sc\" (UID: \"c2aaebe4-01e2-4c92-b302-1967751f004d\") " pod="openshift-controller-manager/controller-manager-fbd5d9bb6-cv9sc" Feb 23 00:11:08 crc kubenswrapper[4735]: I0223 
00:11:08.194418 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-fbd5d9bb6-cv9sc" Feb 23 00:11:08 crc kubenswrapper[4735]: I0223 00:11:08.283274 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ec6d401-e9a0-4981-9f41-3c07d8bda018" path="/var/lib/kubelet/pods/9ec6d401-e9a0-4981-9f41-3c07d8bda018/volumes" Feb 23 00:11:08 crc kubenswrapper[4735]: I0223 00:11:08.472364 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-fbd5d9bb6-cv9sc"] Feb 23 00:11:08 crc kubenswrapper[4735]: I0223 00:11:08.600428 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-fbd5d9bb6-cv9sc" event={"ID":"c2aaebe4-01e2-4c92-b302-1967751f004d","Type":"ContainerStarted","Data":"0385ec7f01245c041d66242d77427a7dbe3f2cd75c8aa0cdb783eff18d780e0a"} Feb 23 00:11:09 crc kubenswrapper[4735]: I0223 00:11:09.612198 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-fbd5d9bb6-cv9sc" event={"ID":"c2aaebe4-01e2-4c92-b302-1967751f004d","Type":"ContainerStarted","Data":"51d817453410333c7882dac4a57f6ada42bc7c44badb32056745e271e378fcdf"} Feb 23 00:11:09 crc kubenswrapper[4735]: I0223 00:11:09.612613 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-fbd5d9bb6-cv9sc" Feb 23 00:11:09 crc kubenswrapper[4735]: I0223 00:11:09.621116 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-fbd5d9bb6-cv9sc" Feb 23 00:11:09 crc kubenswrapper[4735]: I0223 00:11:09.638955 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-fbd5d9bb6-cv9sc" podStartSLOduration=3.638929976 podStartE2EDuration="3.638929976s" podCreationTimestamp="2026-02-23 
00:11:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:11:09.637759374 +0000 UTC m=+228.101305375" watchObservedRunningTime="2026-02-23 00:11:09.638929976 +0000 UTC m=+228.102475977" Feb 23 00:11:11 crc kubenswrapper[4735]: I0223 00:11:11.512514 4735 patch_prober.go:28] interesting pod/machine-config-daemon-blmnv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 00:11:11 crc kubenswrapper[4735]: I0223 00:11:11.512953 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" podUID="1cba474f-2d55-4a07-969f-25e2817a06d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 00:11:11 crc kubenswrapper[4735]: I0223 00:11:11.513038 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" Feb 23 00:11:11 crc kubenswrapper[4735]: I0223 00:11:11.514018 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"908fd79ca8e314a794a24b502c23d0b5abbc41b2bf079086e8fe6fc3ceaeafa0"} pod="openshift-machine-config-operator/machine-config-daemon-blmnv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 00:11:11 crc kubenswrapper[4735]: I0223 00:11:11.514146 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" podUID="1cba474f-2d55-4a07-969f-25e2817a06d0" containerName="machine-config-daemon" 
containerID="cri-o://908fd79ca8e314a794a24b502c23d0b5abbc41b2bf079086e8fe6fc3ceaeafa0" gracePeriod=600 Feb 23 00:11:12 crc kubenswrapper[4735]: I0223 00:11:12.636579 4735 generic.go:334] "Generic (PLEG): container finished" podID="1cba474f-2d55-4a07-969f-25e2817a06d0" containerID="908fd79ca8e314a794a24b502c23d0b5abbc41b2bf079086e8fe6fc3ceaeafa0" exitCode=0 Feb 23 00:11:12 crc kubenswrapper[4735]: I0223 00:11:12.636793 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" event={"ID":"1cba474f-2d55-4a07-969f-25e2817a06d0","Type":"ContainerDied","Data":"908fd79ca8e314a794a24b502c23d0b5abbc41b2bf079086e8fe6fc3ceaeafa0"} Feb 23 00:11:12 crc kubenswrapper[4735]: I0223 00:11:12.637641 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" event={"ID":"1cba474f-2d55-4a07-969f-25e2817a06d0","Type":"ContainerStarted","Data":"32a3e61de17574fe655e2f95d40d79b68f89d06de483f3a878f524bc13ce427d"} Feb 23 00:11:18 crc kubenswrapper[4735]: E0223 00:11:18.810957 4735 file.go:109] "Unable to process watch event" err="can't process config file \"/etc/kubernetes/manifests/kube-apiserver-startup-monitor-pod.yaml\": /etc/kubernetes/manifests/kube-apiserver-startup-monitor-pod.yaml: couldn't parse as pod(Object 'Kind' is missing in 'null'), please check config file" Feb 23 00:11:18 crc kubenswrapper[4735]: I0223 00:11:18.812810 4735 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 23 00:11:18 crc kubenswrapper[4735]: I0223 00:11:18.813706 4735 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 23 00:11:18 crc kubenswrapper[4735]: I0223 00:11:18.813867 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 00:11:18 crc kubenswrapper[4735]: I0223 00:11:18.814043 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f" gracePeriod=15 Feb 23 00:11:18 crc kubenswrapper[4735]: I0223 00:11:18.814085 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765" gracePeriod=15 Feb 23 00:11:18 crc kubenswrapper[4735]: I0223 00:11:18.814168 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549" gracePeriod=15 Feb 23 00:11:18 crc kubenswrapper[4735]: I0223 00:11:18.814197 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3" gracePeriod=15 Feb 23 00:11:18 crc kubenswrapper[4735]: I0223 00:11:18.814053 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777" gracePeriod=15 Feb 23 00:11:18 crc 
kubenswrapper[4735]: I0223 00:11:18.815843 4735 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 23 00:11:18 crc kubenswrapper[4735]: E0223 00:11:18.816118 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 23 00:11:18 crc kubenswrapper[4735]: I0223 00:11:18.816198 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 23 00:11:18 crc kubenswrapper[4735]: E0223 00:11:18.816217 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 23 00:11:18 crc kubenswrapper[4735]: I0223 00:11:18.816225 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 23 00:11:18 crc kubenswrapper[4735]: E0223 00:11:18.816236 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 23 00:11:18 crc kubenswrapper[4735]: I0223 00:11:18.816246 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 23 00:11:18 crc kubenswrapper[4735]: E0223 00:11:18.816262 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 23 00:11:18 crc kubenswrapper[4735]: I0223 00:11:18.816270 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 23 00:11:18 crc kubenswrapper[4735]: E0223 00:11:18.816281 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 23 00:11:18 crc 
kubenswrapper[4735]: I0223 00:11:18.816288 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 23 00:11:18 crc kubenswrapper[4735]: E0223 00:11:18.816298 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 23 00:11:18 crc kubenswrapper[4735]: I0223 00:11:18.816305 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 23 00:11:18 crc kubenswrapper[4735]: E0223 00:11:18.816315 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 23 00:11:18 crc kubenswrapper[4735]: I0223 00:11:18.816325 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 23 00:11:18 crc kubenswrapper[4735]: I0223 00:11:18.816498 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 23 00:11:18 crc kubenswrapper[4735]: I0223 00:11:18.816515 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 23 00:11:18 crc kubenswrapper[4735]: I0223 00:11:18.816524 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 23 00:11:18 crc kubenswrapper[4735]: I0223 00:11:18.816538 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 23 00:11:18 crc kubenswrapper[4735]: I0223 00:11:18.816573 4735 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 23 00:11:18 crc kubenswrapper[4735]: I0223 00:11:18.816828 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 23 00:11:18 crc kubenswrapper[4735]: I0223 00:11:18.902350 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 00:11:18 crc kubenswrapper[4735]: I0223 00:11:18.902393 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 00:11:18 crc kubenswrapper[4735]: I0223 00:11:18.902456 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 00:11:18 crc kubenswrapper[4735]: I0223 00:11:18.902551 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 00:11:18 crc 
kubenswrapper[4735]: I0223 00:11:18.902591 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 00:11:18 crc kubenswrapper[4735]: I0223 00:11:18.902618 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 00:11:18 crc kubenswrapper[4735]: I0223 00:11:18.902645 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 00:11:18 crc kubenswrapper[4735]: I0223 00:11:18.902679 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 00:11:19 crc kubenswrapper[4735]: I0223 00:11:19.004035 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 00:11:19 crc kubenswrapper[4735]: I0223 
00:11:19.004104 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 23 00:11:19 crc kubenswrapper[4735]: I0223 00:11:19.004186 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 23 00:11:19 crc kubenswrapper[4735]: I0223 00:11:19.004338 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 00:11:19 crc kubenswrapper[4735]: I0223 00:11:19.004390 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 23 00:11:19 crc kubenswrapper[4735]: I0223 00:11:19.004221 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 23 00:11:19 crc kubenswrapper[4735]: I0223 00:11:19.004434 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 23 00:11:19 crc kubenswrapper[4735]: I0223 00:11:19.004398 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 00:11:19 crc kubenswrapper[4735]: I0223 00:11:19.004457 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 00:11:19 crc kubenswrapper[4735]: I0223 00:11:19.004505 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 00:11:19 crc kubenswrapper[4735]: I0223 00:11:19.004554 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 00:11:19 crc kubenswrapper[4735]: I0223 00:11:19.004554 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 23 00:11:19 crc kubenswrapper[4735]: I0223 00:11:19.004608 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 23 00:11:19 crc kubenswrapper[4735]: I0223 00:11:19.004612 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 23 00:11:19 crc kubenswrapper[4735]: I0223 00:11:19.004655 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 00:11:19 crc kubenswrapper[4735]: I0223 00:11:19.004634 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 23 00:11:19 crc kubenswrapper[4735]: I0223 00:11:19.229318 4735 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Feb 23 00:11:19 crc kubenswrapper[4735]: I0223 00:11:19.229815 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Feb 23 00:11:19 crc kubenswrapper[4735]: I0223 00:11:19.232105 4735 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" start-of-body=
Feb 23 00:11:19 crc kubenswrapper[4735]: I0223 00:11:19.232159 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused"
Feb 23 00:11:19 crc kubenswrapper[4735]: I0223 00:11:19.688271 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Feb 23 00:11:19 crc kubenswrapper[4735]: I0223 00:11:19.690191 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 23 00:11:19 crc kubenswrapper[4735]: I0223 00:11:19.690935 4735 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777" exitCode=0
Feb 23 00:11:19 crc kubenswrapper[4735]: I0223 00:11:19.690969 4735 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3" exitCode=0
Feb 23 00:11:19 crc kubenswrapper[4735]: I0223 00:11:19.690984 4735 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765" exitCode=0
Feb 23 00:11:19 crc kubenswrapper[4735]: I0223 00:11:19.690996 4735 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549" exitCode=2
Feb 23 00:11:19 crc kubenswrapper[4735]: I0223 00:11:19.691078 4735 scope.go:117] "RemoveContainer" containerID="0961d80dd0f4cbce3f607e4aedc62f8de01c9b3a8ca68126a5ac981133c1c0f8"
Feb 23 00:11:19 crc kubenswrapper[4735]: I0223 00:11:19.694597 4735 generic.go:334] "Generic (PLEG): container finished" podID="5fd08331-c55b-4c64-b731-546ab66801e7" containerID="d6e85572ac86fe5803be17e4efe2df9466228f93197373f40c3c8221a1a4c1ae" exitCode=0
Feb 23 00:11:19 crc kubenswrapper[4735]: I0223 00:11:19.694771 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5fd08331-c55b-4c64-b731-546ab66801e7","Type":"ContainerDied","Data":"d6e85572ac86fe5803be17e4efe2df9466228f93197373f40c3c8221a1a4c1ae"}
Feb 23 00:11:19 crc kubenswrapper[4735]: I0223 00:11:19.696652 4735 status_manager.go:851] "Failed to get status for pod" podUID="5fd08331-c55b-4c64-b731-546ab66801e7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 23 00:11:19 crc kubenswrapper[4735]: I0223 00:11:19.697038 4735 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 23 00:11:20 crc kubenswrapper[4735]: I0223 00:11:20.705587 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.169338 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.170712 4735 status_manager.go:851] "Failed to get status for pod" podUID="5fd08331-c55b-4c64-b731-546ab66801e7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.174621 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.175319 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.175910 4735 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.176220 4735 status_manager.go:851] "Failed to get status for pod" podUID="5fd08331-c55b-4c64-b731-546ab66801e7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.229759 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5fd08331-c55b-4c64-b731-546ab66801e7-kubelet-dir\") pod \"5fd08331-c55b-4c64-b731-546ab66801e7\" (UID: \"5fd08331-c55b-4c64-b731-546ab66801e7\") "
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.229808 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.229839 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.229902 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5fd08331-c55b-4c64-b731-546ab66801e7-kube-api-access\") pod \"5fd08331-c55b-4c64-b731-546ab66801e7\" (UID: \"5fd08331-c55b-4c64-b731-546ab66801e7\") "
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.229907 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.229924 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.229928 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5fd08331-c55b-4c64-b731-546ab66801e7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5fd08331-c55b-4c64-b731-546ab66801e7" (UID: "5fd08331-c55b-4c64-b731-546ab66801e7"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.229948 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.229966 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5fd08331-c55b-4c64-b731-546ab66801e7-var-lock\") pod \"5fd08331-c55b-4c64-b731-546ab66801e7\" (UID: \"5fd08331-c55b-4c64-b731-546ab66801e7\") "
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.230036 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.230083 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5fd08331-c55b-4c64-b731-546ab66801e7-var-lock" (OuterVolumeSpecName: "var-lock") pod "5fd08331-c55b-4c64-b731-546ab66801e7" (UID: "5fd08331-c55b-4c64-b731-546ab66801e7"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.230305 4735 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5fd08331-c55b-4c64-b731-546ab66801e7-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.230319 4735 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\""
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.230327 4735 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\""
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.230335 4735 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\""
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.230342 4735 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5fd08331-c55b-4c64-b731-546ab66801e7-var-lock\") on node \"crc\" DevicePath \"\""
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.237458 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fd08331-c55b-4c64-b731-546ab66801e7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5fd08331-c55b-4c64-b731-546ab66801e7" (UID: "5fd08331-c55b-4c64-b731-546ab66801e7"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.331872 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5fd08331-c55b-4c64-b731-546ab66801e7-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.713697 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.713703 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5fd08331-c55b-4c64-b731-546ab66801e7","Type":"ContainerDied","Data":"55549792dee1248da2cfdece269052d12fa6f42bcda807801ee5b2e7a8fc7b9f"}
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.713832 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55549792dee1248da2cfdece269052d12fa6f42bcda807801ee5b2e7a8fc7b9f"
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.717093 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.717795 4735 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f" exitCode=0
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.717844 4735 scope.go:117] "RemoveContainer" containerID="df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777"
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.717992 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.740452 4735 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.741304 4735 status_manager.go:851] "Failed to get status for pod" podUID="5fd08331-c55b-4c64-b731-546ab66801e7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.741900 4735 scope.go:117] "RemoveContainer" containerID="e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3"
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.745484 4735 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.745925 4735 status_manager.go:851] "Failed to get status for pod" podUID="5fd08331-c55b-4c64-b731-546ab66801e7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.765910 4735 scope.go:117] "RemoveContainer" containerID="1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765"
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.789140 4735 scope.go:117] "RemoveContainer" containerID="fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549"
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.811302 4735 scope.go:117] "RemoveContainer" containerID="fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f"
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.830298 4735 scope.go:117] "RemoveContainer" containerID="3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676"
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.857387 4735 scope.go:117] "RemoveContainer" containerID="df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777"
Feb 23 00:11:21 crc kubenswrapper[4735]: E0223 00:11:21.857932 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777\": container with ID starting with df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777 not found: ID does not exist" containerID="df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777"
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.857977 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777"} err="failed to get container status \"df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777\": rpc error: code = NotFound desc = could not find container \"df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777\": container with ID starting with df7f22859b2478fcfa8b965e551e6fdcac13aa63437dd29da22159fd4ba1d777 not found: ID does not exist"
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.858009 4735 scope.go:117] "RemoveContainer" containerID="e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3"
Feb 23 00:11:21 crc kubenswrapper[4735]: E0223 00:11:21.858333 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3\": container with ID starting with e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3 not found: ID does not exist" containerID="e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3"
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.858366 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3"} err="failed to get container status \"e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3\": rpc error: code = NotFound desc = could not find container \"e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3\": container with ID starting with e96130823bfc8a19f99c0796f3d97fa53ff1911f88a537f6c0259c5e86e3d2c3 not found: ID does not exist"
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.858391 4735 scope.go:117] "RemoveContainer" containerID="1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765"
Feb 23 00:11:21 crc kubenswrapper[4735]: E0223 00:11:21.859665 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765\": container with ID starting with 1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765 not found: ID does not exist" containerID="1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765"
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.859688 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765"} err="failed to get container status \"1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765\": rpc error: code = NotFound desc = could not find container \"1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765\": container with ID starting with 1bd33b36a4dd933c4ce4c02a532e989aaa03052af9bb25b8364582e5e0b2d765 not found: ID does not exist"
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.859706 4735 scope.go:117] "RemoveContainer" containerID="fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549"
Feb 23 00:11:21 crc kubenswrapper[4735]: E0223 00:11:21.859977 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549\": container with ID starting with fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549 not found: ID does not exist" containerID="fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549"
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.859996 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549"} err="failed to get container status \"fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549\": rpc error: code = NotFound desc = could not find container \"fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549\": container with ID starting with fdc03eb2199e875a7040e48158a289974b3ba36b6632dd2eb0d6f6d802551549 not found: ID does not exist"
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.860011 4735 scope.go:117] "RemoveContainer" containerID="fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f"
Feb 23 00:11:21 crc kubenswrapper[4735]: E0223 00:11:21.860282 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f\": container with ID starting with fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f not found: ID does not exist" containerID="fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f"
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.860304 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f"} err="failed to get container status \"fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f\": rpc error: code = NotFound desc = could not find container \"fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f\": container with ID starting with fad1688ff0aeb1c81686324c8df8513e57301d4a97bf3a358e0634fda2ab785f not found: ID does not exist"
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.860320 4735 scope.go:117] "RemoveContainer" containerID="3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676"
Feb 23 00:11:21 crc kubenswrapper[4735]: E0223 00:11:21.861066 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\": container with ID starting with 3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676 not found: ID does not exist" containerID="3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676"
Feb 23 00:11:21 crc kubenswrapper[4735]: I0223 00:11:21.861089 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676"} err="failed to get container status \"3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\": rpc error: code = NotFound desc = could not find container \"3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676\": container with ID starting with 3e32a642a4bdf726af7553692499e1b51d3ef4b1d6c1f873ca84f98f84649676 not found: ID does not exist"
Feb 23 00:11:22 crc kubenswrapper[4735]: E0223 00:11:22.236665 4735 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 23 00:11:22 crc kubenswrapper[4735]: E0223 00:11:22.237053 4735 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 23 00:11:22 crc kubenswrapper[4735]: E0223 00:11:22.237559 4735 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 23 00:11:22 crc kubenswrapper[4735]: E0223 00:11:22.237999 4735 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 23 00:11:22 crc kubenswrapper[4735]: E0223 00:11:22.238541 4735 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 23 00:11:22 crc kubenswrapper[4735]: I0223 00:11:22.238600 4735 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Feb 23 00:11:22 crc kubenswrapper[4735]: E0223 00:11:22.239106 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" interval="200ms"
Feb 23 00:11:22 crc kubenswrapper[4735]: I0223 00:11:22.273570 4735 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 23 00:11:22 crc kubenswrapper[4735]: I0223 00:11:22.274102 4735 status_manager.go:851] "Failed to get status for pod" podUID="5fd08331-c55b-4c64-b731-546ab66801e7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 23 00:11:22 crc kubenswrapper[4735]: I0223 00:11:22.278863 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes"
Feb 23 00:11:22 crc kubenswrapper[4735]: E0223 00:11:22.439470 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" interval="400ms"
Feb 23 00:11:22 crc kubenswrapper[4735]: E0223 00:11:22.840364 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" interval="800ms"
Feb 23 00:11:23 crc kubenswrapper[4735]: E0223 00:11:23.642105 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" interval="1.6s"
Feb 23 00:11:23 crc kubenswrapper[4735]: E0223 00:11:23.853521 4735 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.213:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 23 00:11:23 crc kubenswrapper[4735]: I0223 00:11:23.854164 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 23 00:11:23 crc kubenswrapper[4735]: W0223 00:11:23.876246 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-86fd6ce74f632c171267f5f7bbd9cdd9f8213f3ecbbabe7bcab0bdd3d8aadb29 WatchSource:0}: Error finding container 86fd6ce74f632c171267f5f7bbd9cdd9f8213f3ecbbabe7bcab0bdd3d8aadb29: Status 404 returned error can't find the container with id 86fd6ce74f632c171267f5f7bbd9cdd9f8213f3ecbbabe7bcab0bdd3d8aadb29
Feb 23 00:11:23 crc kubenswrapper[4735]: E0223 00:11:23.879639 4735 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.213:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1896b7b33db12746 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-23 00:11:23.878917958 +0000 UTC m=+242.342463969,LastTimestamp:2026-02-23 00:11:23.878917958 +0000 UTC m=+242.342463969,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 23 00:11:24 crc kubenswrapper[4735]: I0223 00:11:24.742416 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"ae51ea481a5ef75d04bcf7c2e9507cd10ee06c12dde39836d799b180e8c786bc"}
Feb 23 00:11:24 crc kubenswrapper[4735]: I0223 00:11:24.742758 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"86fd6ce74f632c171267f5f7bbd9cdd9f8213f3ecbbabe7bcab0bdd3d8aadb29"}
Feb 23 00:11:24 crc kubenswrapper[4735]: E0223 00:11:24.743304 4735 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.213:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 23 00:11:24 crc kubenswrapper[4735]: I0223 00:11:24.743652 4735 status_manager.go:851] "Failed to get status for pod" podUID="5fd08331-c55b-4c64-b731-546ab66801e7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 23 00:11:25 crc kubenswrapper[4735]: E0223 00:11:25.242946 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" interval="3.2s"
Feb 23 00:11:26 crc kubenswrapper[4735]: E0223 00:11:26.401266 4735 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.213:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1896b7b33db12746 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-23 00:11:23.878917958 +0000 UTC m=+242.342463969,LastTimestamp:2026-02-23 00:11:23.878917958 +0000 UTC m=+242.342463969,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 23 00:11:28 crc kubenswrapper[4735]: E0223 00:11:28.444445 4735 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" interval="6.4s"
Feb 23 00:11:29 crc kubenswrapper[4735]: E0223 00:11:29.331554 4735 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.213:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" volumeName="registry-storage"
Feb 23 00:11:32 crc kubenswrapper[4735]: I0223 00:11:32.277096 4735 status_manager.go:851] "Failed to get status for pod" podUID="5fd08331-c55b-4c64-b731-546ab66801e7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 23 00:11:33 crc kubenswrapper[4735]: I0223 00:11:33.272274 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 00:11:33 crc kubenswrapper[4735]: I0223 00:11:33.273265 4735 status_manager.go:851] "Failed to get status for pod" podUID="5fd08331-c55b-4c64-b731-546ab66801e7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused"
Feb 23 00:11:33 crc kubenswrapper[4735]: I0223 00:11:33.302392 4735 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="40cc14f8-4b0d-4268-82bc-e9c2d8073cf3"
Feb 23 00:11:33 crc kubenswrapper[4735]: I0223 00:11:33.302440 4735 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="40cc14f8-4b0d-4268-82bc-e9c2d8073cf3"
Feb 23 00:11:33 crc kubenswrapper[4735]: E0223 00:11:33.303049 4735 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 23 00:11:33 crc kubenswrapper[4735]: I0223 00:11:33.303645 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 00:11:33 crc kubenswrapper[4735]: E0223 00:11:33.315762 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:11:33Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:11:33Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:11:33Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-23T00:11:33Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 23 00:11:33 crc kubenswrapper[4735]: E0223 00:11:33.316390 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 23 00:11:33 crc kubenswrapper[4735]: E0223 00:11:33.317345 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 23 00:11:33 crc kubenswrapper[4735]: E0223 00:11:33.317990 4735 
kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 23 00:11:33 crc kubenswrapper[4735]: E0223 00:11:33.318570 4735 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 23 00:11:33 crc kubenswrapper[4735]: E0223 00:11:33.318623 4735 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 23 00:11:33 crc kubenswrapper[4735]: W0223 00:11:33.338666 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-e46be3a61649e11cf25e088b494fdeb61b2d37fdba31a4e0aa71751d0a17989d WatchSource:0}: Error finding container e46be3a61649e11cf25e088b494fdeb61b2d37fdba31a4e0aa71751d0a17989d: Status 404 returned error can't find the container with id e46be3a61649e11cf25e088b494fdeb61b2d37fdba31a4e0aa71751d0a17989d Feb 23 00:11:33 crc kubenswrapper[4735]: I0223 00:11:33.802916 4735 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="ad7bcaf379464fd74609f83b8944deb36822a477bc5b5cce0daf7cf4bf1172ec" exitCode=0 Feb 23 00:11:33 crc kubenswrapper[4735]: I0223 00:11:33.803014 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"ad7bcaf379464fd74609f83b8944deb36822a477bc5b5cce0daf7cf4bf1172ec"} Feb 23 00:11:33 crc kubenswrapper[4735]: I0223 00:11:33.803386 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e46be3a61649e11cf25e088b494fdeb61b2d37fdba31a4e0aa71751d0a17989d"} Feb 23 00:11:33 crc kubenswrapper[4735]: I0223 00:11:33.803932 4735 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="40cc14f8-4b0d-4268-82bc-e9c2d8073cf3" Feb 23 00:11:33 crc kubenswrapper[4735]: I0223 00:11:33.803978 4735 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="40cc14f8-4b0d-4268-82bc-e9c2d8073cf3" Feb 23 00:11:33 crc kubenswrapper[4735]: I0223 00:11:33.804408 4735 status_manager.go:851] "Failed to get status for pod" podUID="5fd08331-c55b-4c64-b731-546ab66801e7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 23 00:11:33 crc kubenswrapper[4735]: E0223 00:11:33.804456 4735 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 00:11:33 crc kubenswrapper[4735]: I0223 00:11:33.808342 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 23 00:11:33 crc kubenswrapper[4735]: I0223 00:11:33.808418 4735 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="a47ab0d8562bafd3080c29289e6cd496a9f76188960a3b6036cc044609876406" exitCode=1 Feb 23 00:11:33 crc kubenswrapper[4735]: I0223 00:11:33.808456 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"a47ab0d8562bafd3080c29289e6cd496a9f76188960a3b6036cc044609876406"} Feb 23 00:11:33 crc kubenswrapper[4735]: I0223 00:11:33.809083 4735 scope.go:117] "RemoveContainer" containerID="a47ab0d8562bafd3080c29289e6cd496a9f76188960a3b6036cc044609876406" Feb 23 00:11:33 crc kubenswrapper[4735]: I0223 00:11:33.809946 4735 status_manager.go:851] "Failed to get status for pod" podUID="5fd08331-c55b-4c64-b731-546ab66801e7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 23 00:11:33 crc kubenswrapper[4735]: I0223 00:11:33.811549 4735 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.213:6443: connect: connection refused" Feb 23 00:11:34 crc kubenswrapper[4735]: I0223 00:11:34.182342 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 00:11:34 crc kubenswrapper[4735]: I0223 00:11:34.818193 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5a2c3d300a4271f2df7c778e4bce9621b27381037a93294a6793e37dd9436338"} Feb 23 00:11:34 crc kubenswrapper[4735]: I0223 00:11:34.818512 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f7ee7546ca276fa9ca5279f659774a1bad9148076aa9ecdb6ee772c68a487f6c"} Feb 23 00:11:34 crc 
kubenswrapper[4735]: I0223 00:11:34.823616 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 23 00:11:34 crc kubenswrapper[4735]: I0223 00:11:34.823686 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b784eaeb453cabf3f1fb254ac970a351a70f5af7e7788ab1f21d29ef4cf04b11"} Feb 23 00:11:35 crc kubenswrapper[4735]: I0223 00:11:35.834216 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ccc807e3e2fd6e442442c672cb66f2cf6783fdf51637459123f5f3542a00563c"} Feb 23 00:11:35 crc kubenswrapper[4735]: I0223 00:11:35.834629 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1bfdaaf8fbb74b785f850917abe7eb8e08ea86d90d2275e9a9538127245d8136"} Feb 23 00:11:35 crc kubenswrapper[4735]: I0223 00:11:35.834567 4735 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="40cc14f8-4b0d-4268-82bc-e9c2d8073cf3" Feb 23 00:11:35 crc kubenswrapper[4735]: I0223 00:11:35.834650 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9ab74bb7952745d7f26d61be02df813e827c6f1cd6a8eeedff9ce990fa3aebdc"} Feb 23 00:11:35 crc kubenswrapper[4735]: I0223 00:11:35.834665 4735 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="40cc14f8-4b0d-4268-82bc-e9c2d8073cf3" Feb 23 00:11:37 crc kubenswrapper[4735]: I0223 00:11:37.559485 4735 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 00:11:37 crc kubenswrapper[4735]: I0223 00:11:37.566553 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 00:11:37 crc kubenswrapper[4735]: I0223 00:11:37.848689 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 00:11:38 crc kubenswrapper[4735]: I0223 00:11:38.303903 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 00:11:38 crc kubenswrapper[4735]: I0223 00:11:38.303955 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 00:11:38 crc kubenswrapper[4735]: I0223 00:11:38.311345 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 00:11:40 crc kubenswrapper[4735]: I0223 00:11:40.855285 4735 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 00:11:41 crc kubenswrapper[4735]: I0223 00:11:41.882911 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 00:11:41 crc kubenswrapper[4735]: I0223 00:11:41.882924 4735 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="40cc14f8-4b0d-4268-82bc-e9c2d8073cf3" Feb 23 00:11:41 crc kubenswrapper[4735]: I0223 00:11:41.882996 4735 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="40cc14f8-4b0d-4268-82bc-e9c2d8073cf3" Feb 23 00:11:41 crc kubenswrapper[4735]: I0223 00:11:41.889123 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 00:11:42 crc kubenswrapper[4735]: I0223 00:11:42.290371 4735 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="292f7a7d-1238-41fa-93e1-57dd88346af9" Feb 23 00:11:42 crc kubenswrapper[4735]: I0223 00:11:42.891372 4735 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="40cc14f8-4b0d-4268-82bc-e9c2d8073cf3" Feb 23 00:11:42 crc kubenswrapper[4735]: I0223 00:11:42.891414 4735 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="40cc14f8-4b0d-4268-82bc-e9c2d8073cf3" Feb 23 00:11:42 crc kubenswrapper[4735]: I0223 00:11:42.896380 4735 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="292f7a7d-1238-41fa-93e1-57dd88346af9" Feb 23 00:11:43 crc kubenswrapper[4735]: I0223 00:11:43.898229 4735 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="40cc14f8-4b0d-4268-82bc-e9c2d8073cf3" Feb 23 00:11:43 crc kubenswrapper[4735]: I0223 00:11:43.898274 4735 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="40cc14f8-4b0d-4268-82bc-e9c2d8073cf3" Feb 23 00:11:43 crc kubenswrapper[4735]: I0223 00:11:43.901900 4735 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="292f7a7d-1238-41fa-93e1-57dd88346af9" Feb 23 00:11:44 crc kubenswrapper[4735]: I0223 00:11:44.186548 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 23 00:11:50 crc 
kubenswrapper[4735]: I0223 00:11:50.503395 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 23 00:11:51 crc kubenswrapper[4735]: I0223 00:11:51.734059 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 23 00:11:51 crc kubenswrapper[4735]: I0223 00:11:51.753306 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 23 00:11:51 crc kubenswrapper[4735]: I0223 00:11:51.980781 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 23 00:11:52 crc kubenswrapper[4735]: I0223 00:11:52.167350 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 23 00:11:52 crc kubenswrapper[4735]: I0223 00:11:52.378814 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 23 00:11:52 crc kubenswrapper[4735]: I0223 00:11:52.493177 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 23 00:11:52 crc kubenswrapper[4735]: I0223 00:11:52.542390 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 23 00:11:52 crc kubenswrapper[4735]: I0223 00:11:52.804332 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 23 00:11:52 crc kubenswrapper[4735]: I0223 00:11:52.806613 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 23 00:11:52 crc kubenswrapper[4735]: I0223 00:11:52.890440 4735 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-oauth-apiserver"/"serving-cert" Feb 23 00:11:52 crc kubenswrapper[4735]: I0223 00:11:52.890450 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 23 00:11:53 crc kubenswrapper[4735]: I0223 00:11:53.098819 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 23 00:11:53 crc kubenswrapper[4735]: I0223 00:11:53.134542 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 23 00:11:53 crc kubenswrapper[4735]: I0223 00:11:53.336098 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 23 00:11:53 crc kubenswrapper[4735]: I0223 00:11:53.340934 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 23 00:11:53 crc kubenswrapper[4735]: I0223 00:11:53.504524 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 23 00:11:53 crc kubenswrapper[4735]: I0223 00:11:53.506755 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 23 00:11:53 crc kubenswrapper[4735]: I0223 00:11:53.543525 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 23 00:11:53 crc kubenswrapper[4735]: I0223 00:11:53.692415 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 23 00:11:53 crc kubenswrapper[4735]: I0223 00:11:53.696717 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 23 00:11:53 crc kubenswrapper[4735]: I0223 00:11:53.783030 4735 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 23 00:11:53 crc kubenswrapper[4735]: I0223 00:11:53.917058 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 23 00:11:53 crc kubenswrapper[4735]: I0223 00:11:53.943597 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 23 00:11:53 crc kubenswrapper[4735]: I0223 00:11:53.997337 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 23 00:11:54 crc kubenswrapper[4735]: I0223 00:11:54.024517 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 23 00:11:54 crc kubenswrapper[4735]: I0223 00:11:54.128116 4735 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 23 00:11:54 crc kubenswrapper[4735]: I0223 00:11:54.264274 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 23 00:11:54 crc kubenswrapper[4735]: I0223 00:11:54.308107 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 23 00:11:54 crc kubenswrapper[4735]: I0223 00:11:54.313841 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 23 00:11:54 crc kubenswrapper[4735]: I0223 00:11:54.388457 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 23 00:11:54 crc kubenswrapper[4735]: I0223 00:11:54.439645 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 23 00:11:54 crc kubenswrapper[4735]: I0223 
00:11:54.513924 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 23 00:11:54 crc kubenswrapper[4735]: I0223 00:11:54.526414 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 23 00:11:54 crc kubenswrapper[4735]: I0223 00:11:54.588891 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 23 00:11:54 crc kubenswrapper[4735]: I0223 00:11:54.765505 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 23 00:11:54 crc kubenswrapper[4735]: I0223 00:11:54.924375 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 23 00:11:54 crc kubenswrapper[4735]: I0223 00:11:54.928931 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 23 00:11:54 crc kubenswrapper[4735]: I0223 00:11:54.975051 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 23 00:11:55 crc kubenswrapper[4735]: I0223 00:11:55.131928 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 23 00:11:55 crc kubenswrapper[4735]: I0223 00:11:55.159467 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 23 00:11:55 crc kubenswrapper[4735]: I0223 00:11:55.183198 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 23 00:11:55 crc kubenswrapper[4735]: I0223 00:11:55.194131 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 23 00:11:55 crc 
kubenswrapper[4735]: I0223 00:11:55.214784 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 23 00:11:55 crc kubenswrapper[4735]: I0223 00:11:55.218479 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 23 00:11:55 crc kubenswrapper[4735]: I0223 00:11:55.358378 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 23 00:11:55 crc kubenswrapper[4735]: I0223 00:11:55.360676 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 23 00:11:55 crc kubenswrapper[4735]: I0223 00:11:55.361791 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 23 00:11:55 crc kubenswrapper[4735]: I0223 00:11:55.382434 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 23 00:11:55 crc kubenswrapper[4735]: I0223 00:11:55.504170 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 23 00:11:55 crc kubenswrapper[4735]: I0223 00:11:55.612839 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 23 00:11:55 crc kubenswrapper[4735]: I0223 00:11:55.634046 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 23 00:11:55 crc kubenswrapper[4735]: I0223 00:11:55.778293 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 23 00:11:55 crc kubenswrapper[4735]: I0223 00:11:55.845429 4735 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 23 00:11:55 crc kubenswrapper[4735]: I0223 00:11:55.892952 4735 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 23 00:11:55 crc kubenswrapper[4735]: I0223 00:11:55.944418 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 23 00:11:55 crc kubenswrapper[4735]: I0223 00:11:55.981271 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 23 00:11:55 crc kubenswrapper[4735]: I0223 00:11:55.987339 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 23 00:11:56 crc kubenswrapper[4735]: I0223 00:11:56.060404 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 23 00:11:56 crc kubenswrapper[4735]: I0223 00:11:56.143744 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 23 00:11:56 crc kubenswrapper[4735]: I0223 00:11:56.191831 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 23 00:11:56 crc kubenswrapper[4735]: I0223 00:11:56.269263 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 23 00:11:56 crc kubenswrapper[4735]: I0223 00:11:56.410796 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 23 00:11:56 crc kubenswrapper[4735]: I0223 00:11:56.552293 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 23 00:11:56 crc kubenswrapper[4735]: I0223 00:11:56.573696 4735 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 23 00:11:56 crc kubenswrapper[4735]: I0223 00:11:56.698922 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 23 00:11:56 crc kubenswrapper[4735]: I0223 00:11:56.701117 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 23 00:11:56 crc kubenswrapper[4735]: I0223 00:11:56.709369 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 23 00:11:56 crc kubenswrapper[4735]: I0223 00:11:56.721459 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 23 00:11:56 crc kubenswrapper[4735]: I0223 00:11:56.742508 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 23 00:11:56 crc kubenswrapper[4735]: I0223 00:11:56.779661 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 23 00:11:56 crc kubenswrapper[4735]: I0223 00:11:56.789038 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 23 00:11:56 crc kubenswrapper[4735]: I0223 00:11:56.844346 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 23 00:11:56 crc kubenswrapper[4735]: I0223 00:11:56.874563 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 23 00:11:57 crc kubenswrapper[4735]: I0223 00:11:57.034427 4735 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Feb 23 00:11:57 crc kubenswrapper[4735]: I0223 00:11:57.101477 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Feb 23 00:11:57 crc kubenswrapper[4735]: I0223 00:11:57.225393 4735 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Feb 23 00:11:57 crc kubenswrapper[4735]: I0223 00:11:57.227783 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Feb 23 00:11:57 crc kubenswrapper[4735]: I0223 00:11:57.267262 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Feb 23 00:11:57 crc kubenswrapper[4735]: I0223 00:11:57.303606 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Feb 23 00:11:57 crc kubenswrapper[4735]: I0223 00:11:57.328145 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Feb 23 00:11:57 crc kubenswrapper[4735]: I0223 00:11:57.360399 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Feb 23 00:11:57 crc kubenswrapper[4735]: I0223 00:11:57.423708 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Feb 23 00:11:57 crc kubenswrapper[4735]: I0223 00:11:57.509079 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Feb 23 00:11:57 crc kubenswrapper[4735]: I0223 00:11:57.544556 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Feb 23 00:11:57 crc kubenswrapper[4735]: I0223 00:11:57.546459 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 23 00:11:57 crc kubenswrapper[4735]: I0223 00:11:57.650715 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Feb 23 00:11:57 crc kubenswrapper[4735]: I0223 00:11:57.654240 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Feb 23 00:11:57 crc kubenswrapper[4735]: I0223 00:11:57.661685 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Feb 23 00:11:57 crc kubenswrapper[4735]: I0223 00:11:57.690235 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Feb 23 00:11:57 crc kubenswrapper[4735]: I0223 00:11:57.717012 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Feb 23 00:11:57 crc kubenswrapper[4735]: I0223 00:11:57.785174 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Feb 23 00:11:57 crc kubenswrapper[4735]: I0223 00:11:57.810964 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Feb 23 00:11:57 crc kubenswrapper[4735]: I0223 00:11:57.825705 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Feb 23 00:11:57 crc kubenswrapper[4735]: I0223 00:11:57.882081 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 23 00:11:57 crc kubenswrapper[4735]: I0223 00:11:57.889033 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Feb 23 00:11:58 crc kubenswrapper[4735]: I0223 00:11:58.057727 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Feb 23 00:11:58 crc kubenswrapper[4735]: I0223 00:11:58.162883 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Feb 23 00:11:58 crc kubenswrapper[4735]: I0223 00:11:58.167043 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Feb 23 00:11:58 crc kubenswrapper[4735]: I0223 00:11:58.181682 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Feb 23 00:11:58 crc kubenswrapper[4735]: I0223 00:11:58.187713 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Feb 23 00:11:58 crc kubenswrapper[4735]: I0223 00:11:58.222458 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Feb 23 00:11:58 crc kubenswrapper[4735]: I0223 00:11:58.289517 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Feb 23 00:11:58 crc kubenswrapper[4735]: I0223 00:11:58.341560 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Feb 23 00:11:58 crc kubenswrapper[4735]: I0223 00:11:58.403438 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Feb 23 00:11:58 crc kubenswrapper[4735]: I0223 00:11:58.498498 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 23 00:11:58 crc kubenswrapper[4735]: I0223 00:11:58.650793 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Feb 23 00:11:58 crc kubenswrapper[4735]: I0223 00:11:58.671840 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Feb 23 00:11:58 crc kubenswrapper[4735]: I0223 00:11:58.803616 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Feb 23 00:11:58 crc kubenswrapper[4735]: I0223 00:11:58.810640 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Feb 23 00:11:58 crc kubenswrapper[4735]: I0223 00:11:58.917049 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Feb 23 00:11:59 crc kubenswrapper[4735]: I0223 00:11:59.187171 4735 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Feb 23 00:11:59 crc kubenswrapper[4735]: I0223 00:11:59.188239 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 23 00:11:59 crc kubenswrapper[4735]: I0223 00:11:59.257936 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Feb 23 00:11:59 crc kubenswrapper[4735]: I0223 00:11:59.326054 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 23 00:11:59 crc kubenswrapper[4735]: I0223 00:11:59.376753 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Feb 23 00:11:59 crc kubenswrapper[4735]: I0223 00:11:59.509392 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Feb 23 00:11:59 crc kubenswrapper[4735]: I0223 00:11:59.552187 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Feb 23 00:11:59 crc kubenswrapper[4735]: I0223 00:11:59.579771 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Feb 23 00:11:59 crc kubenswrapper[4735]: I0223 00:11:59.670254 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Feb 23 00:11:59 crc kubenswrapper[4735]: I0223 00:11:59.671338 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Feb 23 00:11:59 crc kubenswrapper[4735]: I0223 00:11:59.722289 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Feb 23 00:11:59 crc kubenswrapper[4735]: I0223 00:11:59.775633 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Feb 23 00:11:59 crc kubenswrapper[4735]: I0223 00:11:59.844711 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Feb 23 00:11:59 crc kubenswrapper[4735]: I0223 00:11:59.863215 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Feb 23 00:12:00 crc kubenswrapper[4735]: I0223 00:12:00.119520 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Feb 23 00:12:00 crc kubenswrapper[4735]: I0223 00:12:00.152172 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Feb 23 00:12:00 crc kubenswrapper[4735]: I0223 00:12:00.166833 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Feb 23 00:12:00 crc kubenswrapper[4735]: I0223 00:12:00.182062 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 23 00:12:00 crc kubenswrapper[4735]: I0223 00:12:00.219828 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Feb 23 00:12:00 crc kubenswrapper[4735]: I0223 00:12:00.270027 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Feb 23 00:12:00 crc kubenswrapper[4735]: I0223 00:12:00.321766 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Feb 23 00:12:00 crc kubenswrapper[4735]: I0223 00:12:00.377052 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 23 00:12:00 crc kubenswrapper[4735]: I0223 00:12:00.487202 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Feb 23 00:12:00 crc kubenswrapper[4735]: I0223 00:12:00.537822 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Feb 23 00:12:00 crc kubenswrapper[4735]: I0223 00:12:00.595035 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Feb 23 00:12:00 crc kubenswrapper[4735]: I0223 00:12:00.598961 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Feb 23 00:12:00 crc kubenswrapper[4735]: I0223 00:12:00.610528 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Feb 23 00:12:00 crc kubenswrapper[4735]: I0223 00:12:00.675387 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Feb 23 00:12:00 crc kubenswrapper[4735]: I0223 00:12:00.675494 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Feb 23 00:12:00 crc kubenswrapper[4735]: I0223 00:12:00.686941 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Feb 23 00:12:00 crc kubenswrapper[4735]: I0223 00:12:00.730722 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Feb 23 00:12:00 crc kubenswrapper[4735]: I0223 00:12:00.911255 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Feb 23 00:12:00 crc kubenswrapper[4735]: I0223 00:12:00.921929 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Feb 23 00:12:01 crc kubenswrapper[4735]: I0223 00:12:01.000711 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Feb 23 00:12:01 crc kubenswrapper[4735]: I0223 00:12:01.050930 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Feb 23 00:12:01 crc kubenswrapper[4735]: I0223 00:12:01.092226 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Feb 23 00:12:01 crc kubenswrapper[4735]: I0223 00:12:01.096679 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Feb 23 00:12:01 crc kubenswrapper[4735]: I0223 00:12:01.154719 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Feb 23 00:12:01 crc kubenswrapper[4735]: I0223 00:12:01.185313 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Feb 23 00:12:01 crc kubenswrapper[4735]: I0223 00:12:01.225138 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Feb 23 00:12:01 crc kubenswrapper[4735]: I0223 00:12:01.225635 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Feb 23 00:12:01 crc kubenswrapper[4735]: I0223 00:12:01.283772 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Feb 23 00:12:01 crc kubenswrapper[4735]: I0223 00:12:01.287822 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Feb 23 00:12:01 crc kubenswrapper[4735]: I0223 00:12:01.339435 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 23 00:12:01 crc kubenswrapper[4735]: I0223 00:12:01.373651 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Feb 23 00:12:01 crc kubenswrapper[4735]: I0223 00:12:01.387755 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Feb 23 00:12:01 crc kubenswrapper[4735]: I0223 00:12:01.505885 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 23 00:12:01 crc kubenswrapper[4735]: I0223 00:12:01.518246 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Feb 23 00:12:01 crc kubenswrapper[4735]: I0223 00:12:01.571821 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Feb 23 00:12:01 crc kubenswrapper[4735]: I0223 00:12:01.619640 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Feb 23 00:12:01 crc kubenswrapper[4735]: I0223 00:12:01.645600 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Feb 23 00:12:01 crc kubenswrapper[4735]: I0223 00:12:01.649790 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Feb 23 00:12:01 crc kubenswrapper[4735]: I0223 00:12:01.726409 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Feb 23 00:12:01 crc kubenswrapper[4735]: I0223 00:12:01.775996 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 23 00:12:01 crc kubenswrapper[4735]: I0223 00:12:01.911199 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 23 00:12:01 crc kubenswrapper[4735]: I0223 00:12:01.930164 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Feb 23 00:12:01 crc kubenswrapper[4735]: I0223 00:12:01.935630 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Feb 23 00:12:02 crc kubenswrapper[4735]: I0223 00:12:02.037272 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 23 00:12:02 crc kubenswrapper[4735]: I0223 00:12:02.098950 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Feb 23 00:12:02 crc kubenswrapper[4735]: I0223 00:12:02.105972 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Feb 23 00:12:02 crc kubenswrapper[4735]: I0223 00:12:02.126974 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Feb 23 00:12:02 crc kubenswrapper[4735]: I0223 00:12:02.138268 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Feb 23 00:12:02 crc kubenswrapper[4735]: I0223 00:12:02.170513 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Feb 23 00:12:02 crc kubenswrapper[4735]: I0223 00:12:02.206460 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Feb 23 00:12:02 crc kubenswrapper[4735]: I0223 00:12:02.227533 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 23 00:12:02 crc kubenswrapper[4735]: I0223 00:12:02.354060 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Feb 23 00:12:02 crc kubenswrapper[4735]: I0223 00:12:02.467324 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Feb 23 00:12:02 crc kubenswrapper[4735]: I0223 00:12:02.515064 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Feb 23 00:12:02 crc kubenswrapper[4735]: I0223 00:12:02.517864 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Feb 23 00:12:02 crc kubenswrapper[4735]: I0223 00:12:02.522935 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Feb 23 00:12:02 crc kubenswrapper[4735]: I0223 00:12:02.612290 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Feb 23 00:12:02 crc kubenswrapper[4735]: I0223 00:12:02.618050 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Feb 23 00:12:02 crc kubenswrapper[4735]: I0223 00:12:02.677732 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Feb 23 00:12:02 crc kubenswrapper[4735]: I0223 00:12:02.801305 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Feb 23 00:12:02 crc kubenswrapper[4735]: I0223 00:12:02.838940 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 23 00:12:02 crc kubenswrapper[4735]: I0223 00:12:02.852323 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Feb 23 00:12:02 crc kubenswrapper[4735]: I0223 00:12:02.852669 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Feb 23 00:12:02 crc kubenswrapper[4735]: I0223 00:12:02.889648 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Feb 23 00:12:02 crc kubenswrapper[4735]: I0223 00:12:02.912481 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Feb 23 00:12:02 crc kubenswrapper[4735]: I0223 00:12:02.956478 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 23 00:12:03 crc kubenswrapper[4735]: I0223 00:12:03.027816 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 23 00:12:03 crc kubenswrapper[4735]: I0223 00:12:03.035982 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Feb 23 00:12:03 crc kubenswrapper[4735]: I0223 00:12:03.044207 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Feb 23 00:12:03 crc kubenswrapper[4735]: I0223 00:12:03.052036 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 23 00:12:03 crc kubenswrapper[4735]: I0223 00:12:03.056815 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Feb 23 00:12:03 crc kubenswrapper[4735]: I0223 00:12:03.090957 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Feb 23 00:12:03 crc kubenswrapper[4735]: I0223 00:12:03.092544 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Feb 23 00:12:03 crc kubenswrapper[4735]: I0223 00:12:03.143799 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Feb 23 00:12:03 crc kubenswrapper[4735]: I0223 00:12:03.180775 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Feb 23 00:12:03 crc kubenswrapper[4735]: I0223 00:12:03.186137 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Feb 23 00:12:03 crc kubenswrapper[4735]: I0223 00:12:03.191926 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Feb 23 00:12:03 crc kubenswrapper[4735]: I0223 00:12:03.205769 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Feb 23 00:12:03 crc kubenswrapper[4735]: I0223 00:12:03.315600 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Feb 23 00:12:03 crc kubenswrapper[4735]: I0223 00:12:03.425831 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Feb 23 00:12:03 crc kubenswrapper[4735]: I0223 00:12:03.496789 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Feb 23 00:12:03 crc kubenswrapper[4735]: I0223 00:12:03.532670 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Feb 23 00:12:03 crc kubenswrapper[4735]: I0223 00:12:03.534313 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Feb 23 00:12:03 crc kubenswrapper[4735]: I0223 00:12:03.598716 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 23 00:12:03 crc kubenswrapper[4735]: I0223 00:12:03.714668 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Feb 23 00:12:03 crc kubenswrapper[4735]: I0223 00:12:03.734042 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Feb 23 00:12:03 crc kubenswrapper[4735]: I0223 00:12:03.834043 4735 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Feb 23 00:12:03 crc kubenswrapper[4735]: I0223 00:12:03.861563 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Feb 23 00:12:03 crc kubenswrapper[4735]: I0223 00:12:03.886085 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Feb 23 00:12:03 crc kubenswrapper[4735]: I0223 00:12:03.960128 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Feb 23 00:12:03 crc kubenswrapper[4735]: I0223 00:12:03.965574 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 23 00:12:04 crc kubenswrapper[4735]: I0223 00:12:04.071833 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Feb 23 00:12:04 crc kubenswrapper[4735]: I0223 00:12:04.174047 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Feb 23 00:12:04 crc kubenswrapper[4735]: I0223 00:12:04.377013 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Feb 23 00:12:04 crc kubenswrapper[4735]: I0223 00:12:04.444172 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Feb 23 00:12:04 crc kubenswrapper[4735]: I0223 00:12:04.496034 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Feb 23 00:12:04 crc kubenswrapper[4735]: I0223 00:12:04.529302 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Feb 23 00:12:04 crc kubenswrapper[4735]: I0223 00:12:04.592230 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Feb 23 00:12:04 crc kubenswrapper[4735]: I0223 00:12:04.601300 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 23 00:12:04 crc kubenswrapper[4735]: I0223 00:12:04.612061 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Feb 23 00:12:04 crc kubenswrapper[4735]: I0223 00:12:04.672449 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Feb 23 00:12:04 crc kubenswrapper[4735]: I0223 00:12:04.695683 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Feb 23 00:12:04 crc kubenswrapper[4735]: I0223 00:12:04.722493 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Feb 23 00:12:04 crc kubenswrapper[4735]: I0223 00:12:04.770600 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Feb 23 00:12:04 crc kubenswrapper[4735]: I0223 00:12:04.787131 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Feb 23 00:12:04 crc kubenswrapper[4735]: I0223 00:12:04.821139 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Feb 23 00:12:04 crc kubenswrapper[4735]: I0223 00:12:04.952693 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Feb 23 00:12:05 crc kubenswrapper[4735]: I0223 00:12:05.047052 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Feb 23 00:12:05 crc kubenswrapper[4735]: I0223 00:12:05.148482 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Feb 23 00:12:05 crc kubenswrapper[4735]: I0223 00:12:05.181351 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Feb 23 00:12:05 crc kubenswrapper[4735]: I0223 00:12:05.425564 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Feb 23 00:12:05 crc kubenswrapper[4735]: I0223 00:12:05.651529 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Feb 23 00:12:05 crc kubenswrapper[4735]: I0223 00:12:05.744833 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Feb 23 00:12:05 crc kubenswrapper[4735]: I0223 00:12:05.771688 4735 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Feb 23 00:12:05 crc kubenswrapper[4735]: I0223 00:12:05.777047 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Feb 23 00:12:05 crc kubenswrapper[4735]: I0223 00:12:05.777107 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Feb 23 00:12:05 crc kubenswrapper[4735]: I0223 00:12:05.777130 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l4mrf","openshift-marketplace/redhat-operators-mvwmg","openshift-marketplace/redhat-marketplace-4gt4h","openshift-marketplace/marketplace-operator-79b997595-fpbc2","openshift-marketplace/certified-operators-72lxc"]
Feb 23 00:12:05 crc kubenswrapper[4735]: I0223 00:12:05.777372 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-72lxc" podUID="f444dee5-d7dc-47ed-add6-b3b1148077f2" containerName="registry-server" containerID="cri-o://36d9ef46f2e7af08966bb2c5cb5cc959ce4264e4905d3d18d1086cd80d1525c5" gracePeriod=30
Feb 23 00:12:05 crc kubenswrapper[4735]: I0223 00:12:05.777487 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mvwmg" podUID="703af3dd-f895-4e96-991a-7e8a405bb03e" containerName="registry-server" containerID="cri-o://06b091925f2c8a3efa1932e7f2e45fede462a9935580ff5a44b489dc05f58547" gracePeriod=30
Feb 23 00:12:05 crc kubenswrapper[4735]: I0223 00:12:05.777642 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4gt4h" podUID="2bac7145-e696-4926-8d9a-de30ef0c6209" containerName="registry-server" containerID="cri-o://fe19d593922e6b68a3f0792c534718d5e727076776f91118b70de938dc4f2c14" gracePeriod=30
Feb 23 00:12:05 crc kubenswrapper[4735]: I0223 00:12:05.777788 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-fpbc2" podUID="58844231-adcc-497d-83a3-bba779038cc2" containerName="marketplace-operator" containerID="cri-o://37674b7dd11e993454cf9296464000d371864eadb768dd85e276bc12b726f400" gracePeriod=30
Feb 23 00:12:05 crc kubenswrapper[4735]: I0223 00:12:05.778144 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-l4mrf" podUID="17e19891-1a63-4c0e-abd9-a161f96cf71e" containerName="registry-server" containerID="cri-o://aa2b5a46ca879fd3df78de833d19749769a40af921bd593f4596f7bec370eaee" gracePeriod=30
Feb 23 00:12:05 crc kubenswrapper[4735]: I0223 00:12:05.820923 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=25.820902418 podStartE2EDuration="25.820902418s" podCreationTimestamp="2026-02-23 00:11:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:12:05.816868058 +0000 UTC m=+284.280414049" watchObservedRunningTime="2026-02-23 00:12:05.820902418 +0000 UTC m=+284.284448399"
Feb 23 00:12:05 crc kubenswrapper[4735]: I0223 00:12:05.858426 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Feb 23 00:12:05 crc kubenswrapper[4735]: I0223 00:12:05.933161 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bcxr6"]
Feb 23 00:12:05 crc kubenswrapper[4735]: E0223 00:12:05.933432 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fd08331-c55b-4c64-b731-546ab66801e7" containerName="installer"
Feb 23 00:12:05 crc kubenswrapper[4735]: I0223 00:12:05.933448 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fd08331-c55b-4c64-b731-546ab66801e7" containerName="installer"
Feb 23 00:12:05 crc kubenswrapper[4735]: I0223 00:12:05.933561 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fd08331-c55b-4c64-b731-546ab66801e7" containerName="installer"
Feb 23 00:12:05 crc kubenswrapper[4735]: I0223 00:12:05.934033 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bcxr6"
Feb 23 00:12:05 crc kubenswrapper[4735]: I0223 00:12:05.937704 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Feb 23 00:12:05 crc kubenswrapper[4735]: I0223 00:12:05.951391 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bcxr6"]
Feb 23 00:12:05 crc kubenswrapper[4735]: I0223 00:12:05.979560 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.061131 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7c59f527-6557-45fe-9bd0-78a30ba8da40-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bcxr6\" (UID: \"7c59f527-6557-45fe-9bd0-78a30ba8da40\") " pod="openshift-marketplace/marketplace-operator-79b997595-bcxr6"
Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.061198 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz64f\" (UniqueName: \"kubernetes.io/projected/7c59f527-6557-45fe-9bd0-78a30ba8da40-kube-api-access-xz64f\") pod \"marketplace-operator-79b997595-bcxr6\" (UID: \"7c59f527-6557-45fe-9bd0-78a30ba8da40\") " pod="openshift-marketplace/marketplace-operator-79b997595-bcxr6"
Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.061248 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7c59f527-6557-45fe-9bd0-78a30ba8da40-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bcxr6\" (UID: \"7c59f527-6557-45fe-9bd0-78a30ba8da40\") " pod="openshift-marketplace/marketplace-operator-79b997595-bcxr6"
Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.063710 4735 generic.go:334] "Generic (PLEG): container finished" podID="703af3dd-f895-4e96-991a-7e8a405bb03e" containerID="06b091925f2c8a3efa1932e7f2e45fede462a9935580ff5a44b489dc05f58547" exitCode=0
Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.063823 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mvwmg" event={"ID":"703af3dd-f895-4e96-991a-7e8a405bb03e","Type":"ContainerDied","Data":"06b091925f2c8a3efa1932e7f2e45fede462a9935580ff5a44b489dc05f58547"}
Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.065638 4735 generic.go:334] "Generic (PLEG): container finished" podID="17e19891-1a63-4c0e-abd9-a161f96cf71e" containerID="aa2b5a46ca879fd3df78de833d19749769a40af921bd593f4596f7bec370eaee" exitCode=0
Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.065711 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l4mrf" event={"ID":"17e19891-1a63-4c0e-abd9-a161f96cf71e","Type":"ContainerDied","Data":"aa2b5a46ca879fd3df78de833d19749769a40af921bd593f4596f7bec370eaee"}
Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.067339 4735 generic.go:334] "Generic (PLEG): container finished" podID="2bac7145-e696-4926-8d9a-de30ef0c6209" containerID="fe19d593922e6b68a3f0792c534718d5e727076776f91118b70de938dc4f2c14" exitCode=0
Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.067428 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4gt4h" event={"ID":"2bac7145-e696-4926-8d9a-de30ef0c6209","Type":"ContainerDied","Data":"fe19d593922e6b68a3f0792c534718d5e727076776f91118b70de938dc4f2c14"}
Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.114325 4735 generic.go:334] "Generic (PLEG): container finished" podID="58844231-adcc-497d-83a3-bba779038cc2" containerID="37674b7dd11e993454cf9296464000d371864eadb768dd85e276bc12b726f400" exitCode=0
Feb 23 00:12:06
crc kubenswrapper[4735]: I0223 00:12:06.114380 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-fpbc2" event={"ID":"58844231-adcc-497d-83a3-bba779038cc2","Type":"ContainerDied","Data":"37674b7dd11e993454cf9296464000d371864eadb768dd85e276bc12b726f400"} Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.125277 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.129283 4735 generic.go:334] "Generic (PLEG): container finished" podID="f444dee5-d7dc-47ed-add6-b3b1148077f2" containerID="36d9ef46f2e7af08966bb2c5cb5cc959ce4264e4905d3d18d1086cd80d1525c5" exitCode=0 Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.129328 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72lxc" event={"ID":"f444dee5-d7dc-47ed-add6-b3b1148077f2","Type":"ContainerDied","Data":"36d9ef46f2e7af08966bb2c5cb5cc959ce4264e4905d3d18d1086cd80d1525c5"} Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.135245 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.161906 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7c59f527-6557-45fe-9bd0-78a30ba8da40-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bcxr6\" (UID: \"7c59f527-6557-45fe-9bd0-78a30ba8da40\") " pod="openshift-marketplace/marketplace-operator-79b997595-bcxr6" Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.161950 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz64f\" (UniqueName: \"kubernetes.io/projected/7c59f527-6557-45fe-9bd0-78a30ba8da40-kube-api-access-xz64f\") pod 
\"marketplace-operator-79b997595-bcxr6\" (UID: \"7c59f527-6557-45fe-9bd0-78a30ba8da40\") " pod="openshift-marketplace/marketplace-operator-79b997595-bcxr6" Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.161989 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7c59f527-6557-45fe-9bd0-78a30ba8da40-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bcxr6\" (UID: \"7c59f527-6557-45fe-9bd0-78a30ba8da40\") " pod="openshift-marketplace/marketplace-operator-79b997595-bcxr6" Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.163090 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7c59f527-6557-45fe-9bd0-78a30ba8da40-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bcxr6\" (UID: \"7c59f527-6557-45fe-9bd0-78a30ba8da40\") " pod="openshift-marketplace/marketplace-operator-79b997595-bcxr6" Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.170749 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/7c59f527-6557-45fe-9bd0-78a30ba8da40-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bcxr6\" (UID: \"7c59f527-6557-45fe-9bd0-78a30ba8da40\") " pod="openshift-marketplace/marketplace-operator-79b997595-bcxr6" Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.182072 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz64f\" (UniqueName: \"kubernetes.io/projected/7c59f527-6557-45fe-9bd0-78a30ba8da40-kube-api-access-xz64f\") pod \"marketplace-operator-79b997595-bcxr6\" (UID: \"7c59f527-6557-45fe-9bd0-78a30ba8da40\") " pod="openshift-marketplace/marketplace-operator-79b997595-bcxr6" Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.196395 4735 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.220998 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l4mrf" Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.354671 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.363653 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17e19891-1a63-4c0e-abd9-a161f96cf71e-utilities\") pod \"17e19891-1a63-4c0e-abd9-a161f96cf71e\" (UID: \"17e19891-1a63-4c0e-abd9-a161f96cf71e\") " Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.365670 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17e19891-1a63-4c0e-abd9-a161f96cf71e-utilities" (OuterVolumeSpecName: "utilities") pod "17e19891-1a63-4c0e-abd9-a161f96cf71e" (UID: "17e19891-1a63-4c0e-abd9-a161f96cf71e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.369070 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54h5v\" (UniqueName: \"kubernetes.io/projected/17e19891-1a63-4c0e-abd9-a161f96cf71e-kube-api-access-54h5v\") pod \"17e19891-1a63-4c0e-abd9-a161f96cf71e\" (UID: \"17e19891-1a63-4c0e-abd9-a161f96cf71e\") " Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.369431 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17e19891-1a63-4c0e-abd9-a161f96cf71e-catalog-content\") pod \"17e19891-1a63-4c0e-abd9-a161f96cf71e\" (UID: \"17e19891-1a63-4c0e-abd9-a161f96cf71e\") " Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.371072 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17e19891-1a63-4c0e-abd9-a161f96cf71e-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.375700 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17e19891-1a63-4c0e-abd9-a161f96cf71e-kube-api-access-54h5v" (OuterVolumeSpecName: "kube-api-access-54h5v") pod "17e19891-1a63-4c0e-abd9-a161f96cf71e" (UID: "17e19891-1a63-4c0e-abd9-a161f96cf71e"). InnerVolumeSpecName "kube-api-access-54h5v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.376773 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4gt4h" Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.420452 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bcxr6" Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.452211 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17e19891-1a63-4c0e-abd9-a161f96cf71e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "17e19891-1a63-4c0e-abd9-a161f96cf71e" (UID: "17e19891-1a63-4c0e-abd9-a161f96cf71e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.472450 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bac7145-e696-4926-8d9a-de30ef0c6209-utilities\") pod \"2bac7145-e696-4926-8d9a-de30ef0c6209\" (UID: \"2bac7145-e696-4926-8d9a-de30ef0c6209\") " Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.472566 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wz5sq\" (UniqueName: \"kubernetes.io/projected/2bac7145-e696-4926-8d9a-de30ef0c6209-kube-api-access-wz5sq\") pod \"2bac7145-e696-4926-8d9a-de30ef0c6209\" (UID: \"2bac7145-e696-4926-8d9a-de30ef0c6209\") " Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.472647 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bac7145-e696-4926-8d9a-de30ef0c6209-catalog-content\") pod \"2bac7145-e696-4926-8d9a-de30ef0c6209\" (UID: \"2bac7145-e696-4926-8d9a-de30ef0c6209\") " Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.472975 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54h5v\" (UniqueName: \"kubernetes.io/projected/17e19891-1a63-4c0e-abd9-a161f96cf71e-kube-api-access-54h5v\") on node \"crc\" DevicePath \"\"" Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.472993 4735 reconciler_common.go:293] "Volume detached for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17e19891-1a63-4c0e-abd9-a161f96cf71e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.475951 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bac7145-e696-4926-8d9a-de30ef0c6209-utilities" (OuterVolumeSpecName: "utilities") pod "2bac7145-e696-4926-8d9a-de30ef0c6209" (UID: "2bac7145-e696-4926-8d9a-de30ef0c6209"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.481599 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bac7145-e696-4926-8d9a-de30ef0c6209-kube-api-access-wz5sq" (OuterVolumeSpecName: "kube-api-access-wz5sq") pod "2bac7145-e696-4926-8d9a-de30ef0c6209" (UID: "2bac7145-e696-4926-8d9a-de30ef0c6209"). InnerVolumeSpecName "kube-api-access-wz5sq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.482436 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-72lxc" Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.493378 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mvwmg" Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.507491 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-fpbc2" Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.515019 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.524641 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bac7145-e696-4926-8d9a-de30ef0c6209-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2bac7145-e696-4926-8d9a-de30ef0c6209" (UID: "2bac7145-e696-4926-8d9a-de30ef0c6209"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.573667 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvdzx\" (UniqueName: \"kubernetes.io/projected/58844231-adcc-497d-83a3-bba779038cc2-kube-api-access-hvdzx\") pod \"58844231-adcc-497d-83a3-bba779038cc2\" (UID: \"58844231-adcc-497d-83a3-bba779038cc2\") " Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.573705 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f444dee5-d7dc-47ed-add6-b3b1148077f2-utilities\") pod \"f444dee5-d7dc-47ed-add6-b3b1148077f2\" (UID: \"f444dee5-d7dc-47ed-add6-b3b1148077f2\") " Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.573740 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/703af3dd-f895-4e96-991a-7e8a405bb03e-catalog-content\") pod \"703af3dd-f895-4e96-991a-7e8a405bb03e\" (UID: \"703af3dd-f895-4e96-991a-7e8a405bb03e\") " Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.573786 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzbsx\" (UniqueName: 
\"kubernetes.io/projected/703af3dd-f895-4e96-991a-7e8a405bb03e-kube-api-access-fzbsx\") pod \"703af3dd-f895-4e96-991a-7e8a405bb03e\" (UID: \"703af3dd-f895-4e96-991a-7e8a405bb03e\") " Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.573813 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/58844231-adcc-497d-83a3-bba779038cc2-marketplace-operator-metrics\") pod \"58844231-adcc-497d-83a3-bba779038cc2\" (UID: \"58844231-adcc-497d-83a3-bba779038cc2\") " Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.575349 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f444dee5-d7dc-47ed-add6-b3b1148077f2-utilities" (OuterVolumeSpecName: "utilities") pod "f444dee5-d7dc-47ed-add6-b3b1148077f2" (UID: "f444dee5-d7dc-47ed-add6-b3b1148077f2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.578683 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kfbt\" (UniqueName: \"kubernetes.io/projected/f444dee5-d7dc-47ed-add6-b3b1148077f2-kube-api-access-4kfbt\") pod \"f444dee5-d7dc-47ed-add6-b3b1148077f2\" (UID: \"f444dee5-d7dc-47ed-add6-b3b1148077f2\") " Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.578747 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/703af3dd-f895-4e96-991a-7e8a405bb03e-utilities\") pod \"703af3dd-f895-4e96-991a-7e8a405bb03e\" (UID: \"703af3dd-f895-4e96-991a-7e8a405bb03e\") " Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.578796 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/58844231-adcc-497d-83a3-bba779038cc2-marketplace-trusted-ca\") pod 
\"58844231-adcc-497d-83a3-bba779038cc2\" (UID: \"58844231-adcc-497d-83a3-bba779038cc2\") " Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.578828 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f444dee5-d7dc-47ed-add6-b3b1148077f2-catalog-content\") pod \"f444dee5-d7dc-47ed-add6-b3b1148077f2\" (UID: \"f444dee5-d7dc-47ed-add6-b3b1148077f2\") " Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.579166 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bac7145-e696-4926-8d9a-de30ef0c6209-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.579178 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f444dee5-d7dc-47ed-add6-b3b1148077f2-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.579188 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bac7145-e696-4926-8d9a-de30ef0c6209-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.579198 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wz5sq\" (UniqueName: \"kubernetes.io/projected/2bac7145-e696-4926-8d9a-de30ef0c6209-kube-api-access-wz5sq\") on node \"crc\" DevicePath \"\"" Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.579999 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58844231-adcc-497d-83a3-bba779038cc2-kube-api-access-hvdzx" (OuterVolumeSpecName: "kube-api-access-hvdzx") pod "58844231-adcc-497d-83a3-bba779038cc2" (UID: "58844231-adcc-497d-83a3-bba779038cc2"). InnerVolumeSpecName "kube-api-access-hvdzx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.580482 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/703af3dd-f895-4e96-991a-7e8a405bb03e-utilities" (OuterVolumeSpecName: "utilities") pod "703af3dd-f895-4e96-991a-7e8a405bb03e" (UID: "703af3dd-f895-4e96-991a-7e8a405bb03e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.580654 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/703af3dd-f895-4e96-991a-7e8a405bb03e-kube-api-access-fzbsx" (OuterVolumeSpecName: "kube-api-access-fzbsx") pod "703af3dd-f895-4e96-991a-7e8a405bb03e" (UID: "703af3dd-f895-4e96-991a-7e8a405bb03e"). InnerVolumeSpecName "kube-api-access-fzbsx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.580679 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58844231-adcc-497d-83a3-bba779038cc2-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "58844231-adcc-497d-83a3-bba779038cc2" (UID: "58844231-adcc-497d-83a3-bba779038cc2"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.582259 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58844231-adcc-497d-83a3-bba779038cc2-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "58844231-adcc-497d-83a3-bba779038cc2" (UID: "58844231-adcc-497d-83a3-bba779038cc2"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.582739 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f444dee5-d7dc-47ed-add6-b3b1148077f2-kube-api-access-4kfbt" (OuterVolumeSpecName: "kube-api-access-4kfbt") pod "f444dee5-d7dc-47ed-add6-b3b1148077f2" (UID: "f444dee5-d7dc-47ed-add6-b3b1148077f2"). InnerVolumeSpecName "kube-api-access-4kfbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.634556 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.648253 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f444dee5-d7dc-47ed-add6-b3b1148077f2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f444dee5-d7dc-47ed-add6-b3b1148077f2" (UID: "f444dee5-d7dc-47ed-add6-b3b1148077f2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.678370 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.680882 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzbsx\" (UniqueName: \"kubernetes.io/projected/703af3dd-f895-4e96-991a-7e8a405bb03e-kube-api-access-fzbsx\") on node \"crc\" DevicePath \"\"" Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.680918 4735 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/58844231-adcc-497d-83a3-bba779038cc2-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.680931 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kfbt\" (UniqueName: \"kubernetes.io/projected/f444dee5-d7dc-47ed-add6-b3b1148077f2-kube-api-access-4kfbt\") on node \"crc\" DevicePath \"\"" Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.680968 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/703af3dd-f895-4e96-991a-7e8a405bb03e-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.680981 4735 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/58844231-adcc-497d-83a3-bba779038cc2-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.680992 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f444dee5-d7dc-47ed-add6-b3b1148077f2-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.681002 4735 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvdzx\" (UniqueName: \"kubernetes.io/projected/58844231-adcc-497d-83a3-bba779038cc2-kube-api-access-hvdzx\") on node \"crc\" DevicePath \"\"" Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.716487 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/703af3dd-f895-4e96-991a-7e8a405bb03e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "703af3dd-f895-4e96-991a-7e8a405bb03e" (UID: "703af3dd-f895-4e96-991a-7e8a405bb03e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 00:12:06 crc kubenswrapper[4735]: I0223 00:12:06.782525 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/703af3dd-f895-4e96-991a-7e8a405bb03e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 00:12:07 crc kubenswrapper[4735]: I0223 00:12:07.135465 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4gt4h" event={"ID":"2bac7145-e696-4926-8d9a-de30ef0c6209","Type":"ContainerDied","Data":"ef39aefa5580de7ff0a0adf13684d3abbb74b26cbcdd72e216b62bb63dd77f8d"} Feb 23 00:12:07 crc kubenswrapper[4735]: I0223 00:12:07.135697 4735 scope.go:117] "RemoveContainer" containerID="fe19d593922e6b68a3f0792c534718d5e727076776f91118b70de938dc4f2c14" Feb 23 00:12:07 crc kubenswrapper[4735]: I0223 00:12:07.135801 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4gt4h" Feb 23 00:12:07 crc kubenswrapper[4735]: I0223 00:12:07.146096 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-fpbc2" event={"ID":"58844231-adcc-497d-83a3-bba779038cc2","Type":"ContainerDied","Data":"f01338ae3147e30c930dd481e60983ff4a7bcc329158503b8e935162d422a10d"} Feb 23 00:12:07 crc kubenswrapper[4735]: I0223 00:12:07.146180 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-fpbc2" Feb 23 00:12:07 crc kubenswrapper[4735]: I0223 00:12:07.150573 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72lxc" event={"ID":"f444dee5-d7dc-47ed-add6-b3b1148077f2","Type":"ContainerDied","Data":"a8e9a0524209a9712f0f446ee419d2546d688d639a6737975c44ffa269457568"} Feb 23 00:12:07 crc kubenswrapper[4735]: I0223 00:12:07.150638 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-72lxc" Feb 23 00:12:07 crc kubenswrapper[4735]: I0223 00:12:07.153603 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mvwmg" event={"ID":"703af3dd-f895-4e96-991a-7e8a405bb03e","Type":"ContainerDied","Data":"fefd497998a02bb20028a7db5e8b44cccfe20ad226dcb747e963b36259a35adc"} Feb 23 00:12:07 crc kubenswrapper[4735]: I0223 00:12:07.153697 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mvwmg" Feb 23 00:12:07 crc kubenswrapper[4735]: I0223 00:12:07.159612 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l4mrf" event={"ID":"17e19891-1a63-4c0e-abd9-a161f96cf71e","Type":"ContainerDied","Data":"bce3cf1157e9325a201341711515fc1ac739dac84d8b64a31318d22f03be1e03"} Feb 23 00:12:07 crc kubenswrapper[4735]: I0223 00:12:07.159693 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l4mrf" Feb 23 00:12:07 crc kubenswrapper[4735]: I0223 00:12:07.475172 4735 scope.go:117] "RemoveContainer" containerID="0021d78dd99885ce26ea1bdf7b7df2e60ab9fb2d365126de48066237a4d887de" Feb 23 00:12:07 crc kubenswrapper[4735]: I0223 00:12:07.475252 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 23 00:12:07 crc kubenswrapper[4735]: I0223 00:12:07.500508 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mvwmg"] Feb 23 00:12:07 crc kubenswrapper[4735]: I0223 00:12:07.504134 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mvwmg"] Feb 23 00:12:07 crc kubenswrapper[4735]: I0223 00:12:07.508182 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bcxr6"] Feb 23 00:12:07 crc kubenswrapper[4735]: I0223 00:12:07.508330 4735 scope.go:117] "RemoveContainer" containerID="a6802770d7830967bfb75e75172a24a6d189aece352443af0e4e72246b51ab73" Feb 23 00:12:07 crc kubenswrapper[4735]: I0223 00:12:07.511713 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4gt4h"] Feb 23 00:12:07 crc kubenswrapper[4735]: I0223 00:12:07.516704 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-4gt4h"] Feb 23 00:12:07 crc kubenswrapper[4735]: I0223 00:12:07.525407 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fpbc2"] Feb 23 00:12:07 crc kubenswrapper[4735]: I0223 00:12:07.530985 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-fpbc2"] Feb 23 00:12:07 crc kubenswrapper[4735]: I0223 00:12:07.539305 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-72lxc"] Feb 23 00:12:07 crc kubenswrapper[4735]: I0223 00:12:07.542936 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-72lxc"] Feb 23 00:12:07 crc kubenswrapper[4735]: I0223 00:12:07.548462 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l4mrf"] Feb 23 00:12:07 crc kubenswrapper[4735]: I0223 00:12:07.550479 4735 scope.go:117] "RemoveContainer" containerID="37674b7dd11e993454cf9296464000d371864eadb768dd85e276bc12b726f400" Feb 23 00:12:07 crc kubenswrapper[4735]: I0223 00:12:07.551452 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-l4mrf"] Feb 23 00:12:07 crc kubenswrapper[4735]: I0223 00:12:07.580229 4735 scope.go:117] "RemoveContainer" containerID="36d9ef46f2e7af08966bb2c5cb5cc959ce4264e4905d3d18d1086cd80d1525c5" Feb 23 00:12:07 crc kubenswrapper[4735]: I0223 00:12:07.599800 4735 scope.go:117] "RemoveContainer" containerID="4f58202dfbd0973b3c8d29e475d56037577288f1deaa28c554a85c951a7a11c3" Feb 23 00:12:07 crc kubenswrapper[4735]: I0223 00:12:07.627465 4735 scope.go:117] "RemoveContainer" containerID="3dbc5778e6645d9d173241cf310187e13a4d5d431d19e3ba8b7a95dd2fdea9be" Feb 23 00:12:07 crc kubenswrapper[4735]: I0223 00:12:07.646106 4735 scope.go:117] "RemoveContainer" 
containerID="06b091925f2c8a3efa1932e7f2e45fede462a9935580ff5a44b489dc05f58547" Feb 23 00:12:07 crc kubenswrapper[4735]: I0223 00:12:07.665456 4735 scope.go:117] "RemoveContainer" containerID="1e42f25e19e712b80a9bf6a46633d1adb8ee6662e617935da0a81cff57384d3f" Feb 23 00:12:07 crc kubenswrapper[4735]: I0223 00:12:07.680678 4735 scope.go:117] "RemoveContainer" containerID="ff364cb769ee3e3749210e14bd4be3e6cddec74f6ae80a77511f10d6b65772d8" Feb 23 00:12:07 crc kubenswrapper[4735]: I0223 00:12:07.695752 4735 scope.go:117] "RemoveContainer" containerID="aa2b5a46ca879fd3df78de833d19749769a40af921bd593f4596f7bec370eaee" Feb 23 00:12:07 crc kubenswrapper[4735]: I0223 00:12:07.710125 4735 scope.go:117] "RemoveContainer" containerID="faea45e90bd87cce8b67e596b7caa61f18b65de0612d204053276442c1412e0d" Feb 23 00:12:07 crc kubenswrapper[4735]: I0223 00:12:07.728900 4735 scope.go:117] "RemoveContainer" containerID="2906412ded4239feec64b43dd70a9c33432d07e9bf8e28608c56606e91821364" Feb 23 00:12:08 crc kubenswrapper[4735]: I0223 00:12:08.015417 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 23 00:12:08 crc kubenswrapper[4735]: I0223 00:12:08.169624 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bcxr6" event={"ID":"7c59f527-6557-45fe-9bd0-78a30ba8da40","Type":"ContainerStarted","Data":"9ddca8a114f765d57b28147ed1457152bdd76ade56341161996913614aeffca0"} Feb 23 00:12:08 crc kubenswrapper[4735]: I0223 00:12:08.169688 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bcxr6" event={"ID":"7c59f527-6557-45fe-9bd0-78a30ba8da40","Type":"ContainerStarted","Data":"ee65c632bb070de05eebdfe22b60d7a7012edb99244aa1a1837af04e872f5e47"} Feb 23 00:12:08 crc kubenswrapper[4735]: I0223 00:12:08.170079 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/marketplace-operator-79b997595-bcxr6" Feb 23 00:12:08 crc kubenswrapper[4735]: I0223 00:12:08.173218 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-bcxr6" Feb 23 00:12:08 crc kubenswrapper[4735]: I0223 00:12:08.201521 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-bcxr6" podStartSLOduration=3.201489232 podStartE2EDuration="3.201489232s" podCreationTimestamp="2026-02-23 00:12:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:12:08.185450363 +0000 UTC m=+286.648996364" watchObservedRunningTime="2026-02-23 00:12:08.201489232 +0000 UTC m=+286.665035243" Feb 23 00:12:08 crc kubenswrapper[4735]: I0223 00:12:08.280409 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17e19891-1a63-4c0e-abd9-a161f96cf71e" path="/var/lib/kubelet/pods/17e19891-1a63-4c0e-abd9-a161f96cf71e/volumes" Feb 23 00:12:08 crc kubenswrapper[4735]: I0223 00:12:08.281288 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bac7145-e696-4926-8d9a-de30ef0c6209" path="/var/lib/kubelet/pods/2bac7145-e696-4926-8d9a-de30ef0c6209/volumes" Feb 23 00:12:08 crc kubenswrapper[4735]: I0223 00:12:08.282409 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58844231-adcc-497d-83a3-bba779038cc2" path="/var/lib/kubelet/pods/58844231-adcc-497d-83a3-bba779038cc2/volumes" Feb 23 00:12:08 crc kubenswrapper[4735]: I0223 00:12:08.285252 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="703af3dd-f895-4e96-991a-7e8a405bb03e" path="/var/lib/kubelet/pods/703af3dd-f895-4e96-991a-7e8a405bb03e/volumes" Feb 23 00:12:08 crc kubenswrapper[4735]: I0223 00:12:08.286677 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="f444dee5-d7dc-47ed-add6-b3b1148077f2" path="/var/lib/kubelet/pods/f444dee5-d7dc-47ed-add6-b3b1148077f2/volumes" Feb 23 00:12:13 crc kubenswrapper[4735]: I0223 00:12:13.310634 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 23 00:12:13 crc kubenswrapper[4735]: I0223 00:12:13.628612 4735 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 23 00:12:13 crc kubenswrapper[4735]: I0223 00:12:13.628820 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://ae51ea481a5ef75d04bcf7c2e9507cd10ee06c12dde39836d799b180e8c786bc" gracePeriod=5 Feb 23 00:12:19 crc kubenswrapper[4735]: I0223 00:12:19.237758 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 23 00:12:19 crc kubenswrapper[4735]: I0223 00:12:19.238628 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 00:12:19 crc kubenswrapper[4735]: I0223 00:12:19.240063 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 23 00:12:19 crc kubenswrapper[4735]: I0223 00:12:19.240124 4735 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="ae51ea481a5ef75d04bcf7c2e9507cd10ee06c12dde39836d799b180e8c786bc" exitCode=137 Feb 23 00:12:19 crc kubenswrapper[4735]: I0223 00:12:19.240181 4735 scope.go:117] "RemoveContainer" containerID="ae51ea481a5ef75d04bcf7c2e9507cd10ee06c12dde39836d799b180e8c786bc" Feb 23 00:12:19 crc kubenswrapper[4735]: I0223 00:12:19.262478 4735 scope.go:117] "RemoveContainer" containerID="ae51ea481a5ef75d04bcf7c2e9507cd10ee06c12dde39836d799b180e8c786bc" Feb 23 00:12:19 crc kubenswrapper[4735]: E0223 00:12:19.263012 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae51ea481a5ef75d04bcf7c2e9507cd10ee06c12dde39836d799b180e8c786bc\": container with ID starting with ae51ea481a5ef75d04bcf7c2e9507cd10ee06c12dde39836d799b180e8c786bc not found: ID does not exist" containerID="ae51ea481a5ef75d04bcf7c2e9507cd10ee06c12dde39836d799b180e8c786bc" Feb 23 00:12:19 crc kubenswrapper[4735]: I0223 00:12:19.263136 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae51ea481a5ef75d04bcf7c2e9507cd10ee06c12dde39836d799b180e8c786bc"} err="failed to get container status \"ae51ea481a5ef75d04bcf7c2e9507cd10ee06c12dde39836d799b180e8c786bc\": rpc error: code = NotFound desc = could not find container \"ae51ea481a5ef75d04bcf7c2e9507cd10ee06c12dde39836d799b180e8c786bc\": container with ID starting with ae51ea481a5ef75d04bcf7c2e9507cd10ee06c12dde39836d799b180e8c786bc not found: ID does not 
exist" Feb 23 00:12:19 crc kubenswrapper[4735]: I0223 00:12:19.431539 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 23 00:12:19 crc kubenswrapper[4735]: I0223 00:12:19.431602 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 23 00:12:19 crc kubenswrapper[4735]: I0223 00:12:19.431674 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 00:12:19 crc kubenswrapper[4735]: I0223 00:12:19.431758 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 23 00:12:19 crc kubenswrapper[4735]: I0223 00:12:19.431788 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 23 00:12:19 crc kubenswrapper[4735]: I0223 00:12:19.431837 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 00:12:19 crc kubenswrapper[4735]: I0223 00:12:19.431882 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 23 00:12:19 crc kubenswrapper[4735]: I0223 00:12:19.431952 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 00:12:19 crc kubenswrapper[4735]: I0223 00:12:19.431966 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 00:12:19 crc kubenswrapper[4735]: I0223 00:12:19.432315 4735 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 23 00:12:19 crc kubenswrapper[4735]: I0223 00:12:19.432338 4735 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 23 00:12:19 crc kubenswrapper[4735]: I0223 00:12:19.432357 4735 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 23 00:12:19 crc kubenswrapper[4735]: I0223 00:12:19.432375 4735 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 23 00:12:19 crc kubenswrapper[4735]: I0223 00:12:19.447071 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 00:12:19 crc kubenswrapper[4735]: I0223 00:12:19.534280 4735 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 23 00:12:20 crc kubenswrapper[4735]: I0223 00:12:20.246648 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 23 00:12:20 crc kubenswrapper[4735]: I0223 00:12:20.279670 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 23 00:12:22 crc kubenswrapper[4735]: I0223 00:12:22.029712 4735 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Feb 23 00:12:24 crc kubenswrapper[4735]: I0223 00:12:24.815107 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 23 00:12:41 crc kubenswrapper[4735]: I0223 00:12:41.789217 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6xh6c"] Feb 23 00:12:41 crc kubenswrapper[4735]: E0223 00:12:41.789994 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bac7145-e696-4926-8d9a-de30ef0c6209" containerName="registry-server" Feb 23 00:12:41 crc kubenswrapper[4735]: I0223 00:12:41.790008 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bac7145-e696-4926-8d9a-de30ef0c6209" containerName="registry-server" Feb 23 00:12:41 crc kubenswrapper[4735]: E0223 00:12:41.790016 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17e19891-1a63-4c0e-abd9-a161f96cf71e" containerName="extract-utilities" Feb 23 00:12:41 crc kubenswrapper[4735]: I0223 00:12:41.790022 4735 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="17e19891-1a63-4c0e-abd9-a161f96cf71e" containerName="extract-utilities" Feb 23 00:12:41 crc kubenswrapper[4735]: E0223 00:12:41.790034 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 23 00:12:41 crc kubenswrapper[4735]: I0223 00:12:41.790041 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 23 00:12:41 crc kubenswrapper[4735]: E0223 00:12:41.790052 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17e19891-1a63-4c0e-abd9-a161f96cf71e" containerName="registry-server" Feb 23 00:12:41 crc kubenswrapper[4735]: I0223 00:12:41.790059 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="17e19891-1a63-4c0e-abd9-a161f96cf71e" containerName="registry-server" Feb 23 00:12:41 crc kubenswrapper[4735]: E0223 00:12:41.790070 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="703af3dd-f895-4e96-991a-7e8a405bb03e" containerName="extract-utilities" Feb 23 00:12:41 crc kubenswrapper[4735]: I0223 00:12:41.790077 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="703af3dd-f895-4e96-991a-7e8a405bb03e" containerName="extract-utilities" Feb 23 00:12:41 crc kubenswrapper[4735]: E0223 00:12:41.790089 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17e19891-1a63-4c0e-abd9-a161f96cf71e" containerName="extract-content" Feb 23 00:12:41 crc kubenswrapper[4735]: I0223 00:12:41.790096 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="17e19891-1a63-4c0e-abd9-a161f96cf71e" containerName="extract-content" Feb 23 00:12:41 crc kubenswrapper[4735]: E0223 00:12:41.790104 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58844231-adcc-497d-83a3-bba779038cc2" containerName="marketplace-operator" Feb 23 00:12:41 crc kubenswrapper[4735]: I0223 00:12:41.790112 4735 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="58844231-adcc-497d-83a3-bba779038cc2" containerName="marketplace-operator" Feb 23 00:12:41 crc kubenswrapper[4735]: E0223 00:12:41.790125 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bac7145-e696-4926-8d9a-de30ef0c6209" containerName="extract-utilities" Feb 23 00:12:41 crc kubenswrapper[4735]: I0223 00:12:41.790132 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bac7145-e696-4926-8d9a-de30ef0c6209" containerName="extract-utilities" Feb 23 00:12:41 crc kubenswrapper[4735]: E0223 00:12:41.790145 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="703af3dd-f895-4e96-991a-7e8a405bb03e" containerName="registry-server" Feb 23 00:12:41 crc kubenswrapper[4735]: I0223 00:12:41.790152 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="703af3dd-f895-4e96-991a-7e8a405bb03e" containerName="registry-server" Feb 23 00:12:41 crc kubenswrapper[4735]: E0223 00:12:41.790162 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="703af3dd-f895-4e96-991a-7e8a405bb03e" containerName="extract-content" Feb 23 00:12:41 crc kubenswrapper[4735]: I0223 00:12:41.790170 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="703af3dd-f895-4e96-991a-7e8a405bb03e" containerName="extract-content" Feb 23 00:12:41 crc kubenswrapper[4735]: E0223 00:12:41.790181 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f444dee5-d7dc-47ed-add6-b3b1148077f2" containerName="extract-content" Feb 23 00:12:41 crc kubenswrapper[4735]: I0223 00:12:41.790188 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f444dee5-d7dc-47ed-add6-b3b1148077f2" containerName="extract-content" Feb 23 00:12:41 crc kubenswrapper[4735]: E0223 00:12:41.790196 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f444dee5-d7dc-47ed-add6-b3b1148077f2" containerName="extract-utilities" Feb 23 00:12:41 crc kubenswrapper[4735]: I0223 00:12:41.790202 4735 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="f444dee5-d7dc-47ed-add6-b3b1148077f2" containerName="extract-utilities" Feb 23 00:12:41 crc kubenswrapper[4735]: E0223 00:12:41.790211 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bac7145-e696-4926-8d9a-de30ef0c6209" containerName="extract-content" Feb 23 00:12:41 crc kubenswrapper[4735]: I0223 00:12:41.790217 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bac7145-e696-4926-8d9a-de30ef0c6209" containerName="extract-content" Feb 23 00:12:41 crc kubenswrapper[4735]: E0223 00:12:41.790223 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f444dee5-d7dc-47ed-add6-b3b1148077f2" containerName="registry-server" Feb 23 00:12:41 crc kubenswrapper[4735]: I0223 00:12:41.790228 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f444dee5-d7dc-47ed-add6-b3b1148077f2" containerName="registry-server" Feb 23 00:12:41 crc kubenswrapper[4735]: I0223 00:12:41.790309 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bac7145-e696-4926-8d9a-de30ef0c6209" containerName="registry-server" Feb 23 00:12:41 crc kubenswrapper[4735]: I0223 00:12:41.790320 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 23 00:12:41 crc kubenswrapper[4735]: I0223 00:12:41.790328 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="58844231-adcc-497d-83a3-bba779038cc2" containerName="marketplace-operator" Feb 23 00:12:41 crc kubenswrapper[4735]: I0223 00:12:41.790334 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="703af3dd-f895-4e96-991a-7e8a405bb03e" containerName="registry-server" Feb 23 00:12:41 crc kubenswrapper[4735]: I0223 00:12:41.790342 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f444dee5-d7dc-47ed-add6-b3b1148077f2" containerName="registry-server" Feb 23 00:12:41 crc kubenswrapper[4735]: I0223 00:12:41.790352 4735 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="17e19891-1a63-4c0e-abd9-a161f96cf71e" containerName="registry-server" Feb 23 00:12:41 crc kubenswrapper[4735]: I0223 00:12:41.791022 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6xh6c" Feb 23 00:12:41 crc kubenswrapper[4735]: I0223 00:12:41.794346 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 23 00:12:41 crc kubenswrapper[4735]: I0223 00:12:41.804834 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6xh6c"] Feb 23 00:12:41 crc kubenswrapper[4735]: I0223 00:12:41.955775 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2256\" (UniqueName: \"kubernetes.io/projected/faa0a6f2-8a6e-4c82-b876-9cc1cff42496-kube-api-access-c2256\") pod \"redhat-marketplace-6xh6c\" (UID: \"faa0a6f2-8a6e-4c82-b876-9cc1cff42496\") " pod="openshift-marketplace/redhat-marketplace-6xh6c" Feb 23 00:12:41 crc kubenswrapper[4735]: I0223 00:12:41.955954 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faa0a6f2-8a6e-4c82-b876-9cc1cff42496-utilities\") pod \"redhat-marketplace-6xh6c\" (UID: \"faa0a6f2-8a6e-4c82-b876-9cc1cff42496\") " pod="openshift-marketplace/redhat-marketplace-6xh6c" Feb 23 00:12:41 crc kubenswrapper[4735]: I0223 00:12:41.956003 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faa0a6f2-8a6e-4c82-b876-9cc1cff42496-catalog-content\") pod \"redhat-marketplace-6xh6c\" (UID: \"faa0a6f2-8a6e-4c82-b876-9cc1cff42496\") " pod="openshift-marketplace/redhat-marketplace-6xh6c" Feb 23 00:12:41 crc kubenswrapper[4735]: I0223 00:12:41.987940 4735 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4qlrn"] Feb 23 00:12:41 crc kubenswrapper[4735]: I0223 00:12:41.989844 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4qlrn" Feb 23 00:12:41 crc kubenswrapper[4735]: I0223 00:12:41.997818 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 23 00:12:42 crc kubenswrapper[4735]: I0223 00:12:42.007492 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4qlrn"] Feb 23 00:12:42 crc kubenswrapper[4735]: I0223 00:12:42.057631 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2256\" (UniqueName: \"kubernetes.io/projected/faa0a6f2-8a6e-4c82-b876-9cc1cff42496-kube-api-access-c2256\") pod \"redhat-marketplace-6xh6c\" (UID: \"faa0a6f2-8a6e-4c82-b876-9cc1cff42496\") " pod="openshift-marketplace/redhat-marketplace-6xh6c" Feb 23 00:12:42 crc kubenswrapper[4735]: I0223 00:12:42.057689 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faa0a6f2-8a6e-4c82-b876-9cc1cff42496-utilities\") pod \"redhat-marketplace-6xh6c\" (UID: \"faa0a6f2-8a6e-4c82-b876-9cc1cff42496\") " pod="openshift-marketplace/redhat-marketplace-6xh6c" Feb 23 00:12:42 crc kubenswrapper[4735]: I0223 00:12:42.057715 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faa0a6f2-8a6e-4c82-b876-9cc1cff42496-catalog-content\") pod \"redhat-marketplace-6xh6c\" (UID: \"faa0a6f2-8a6e-4c82-b876-9cc1cff42496\") " pod="openshift-marketplace/redhat-marketplace-6xh6c" Feb 23 00:12:42 crc kubenswrapper[4735]: I0223 00:12:42.058253 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/faa0a6f2-8a6e-4c82-b876-9cc1cff42496-catalog-content\") pod \"redhat-marketplace-6xh6c\" (UID: \"faa0a6f2-8a6e-4c82-b876-9cc1cff42496\") " pod="openshift-marketplace/redhat-marketplace-6xh6c" Feb 23 00:12:42 crc kubenswrapper[4735]: I0223 00:12:42.058416 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faa0a6f2-8a6e-4c82-b876-9cc1cff42496-utilities\") pod \"redhat-marketplace-6xh6c\" (UID: \"faa0a6f2-8a6e-4c82-b876-9cc1cff42496\") " pod="openshift-marketplace/redhat-marketplace-6xh6c" Feb 23 00:12:42 crc kubenswrapper[4735]: I0223 00:12:42.092011 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2256\" (UniqueName: \"kubernetes.io/projected/faa0a6f2-8a6e-4c82-b876-9cc1cff42496-kube-api-access-c2256\") pod \"redhat-marketplace-6xh6c\" (UID: \"faa0a6f2-8a6e-4c82-b876-9cc1cff42496\") " pod="openshift-marketplace/redhat-marketplace-6xh6c" Feb 23 00:12:42 crc kubenswrapper[4735]: I0223 00:12:42.120323 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6xh6c" Feb 23 00:12:42 crc kubenswrapper[4735]: I0223 00:12:42.159248 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c37bfa2-47b2-493c-a4ff-5342118dcf93-catalog-content\") pod \"certified-operators-4qlrn\" (UID: \"8c37bfa2-47b2-493c-a4ff-5342118dcf93\") " pod="openshift-marketplace/certified-operators-4qlrn" Feb 23 00:12:42 crc kubenswrapper[4735]: I0223 00:12:42.159447 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr2pj\" (UniqueName: \"kubernetes.io/projected/8c37bfa2-47b2-493c-a4ff-5342118dcf93-kube-api-access-rr2pj\") pod \"certified-operators-4qlrn\" (UID: \"8c37bfa2-47b2-493c-a4ff-5342118dcf93\") " pod="openshift-marketplace/certified-operators-4qlrn" Feb 23 00:12:42 crc kubenswrapper[4735]: I0223 00:12:42.159501 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c37bfa2-47b2-493c-a4ff-5342118dcf93-utilities\") pod \"certified-operators-4qlrn\" (UID: \"8c37bfa2-47b2-493c-a4ff-5342118dcf93\") " pod="openshift-marketplace/certified-operators-4qlrn" Feb 23 00:12:42 crc kubenswrapper[4735]: I0223 00:12:42.261632 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c37bfa2-47b2-493c-a4ff-5342118dcf93-catalog-content\") pod \"certified-operators-4qlrn\" (UID: \"8c37bfa2-47b2-493c-a4ff-5342118dcf93\") " pod="openshift-marketplace/certified-operators-4qlrn" Feb 23 00:12:42 crc kubenswrapper[4735]: I0223 00:12:42.261783 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr2pj\" (UniqueName: \"kubernetes.io/projected/8c37bfa2-47b2-493c-a4ff-5342118dcf93-kube-api-access-rr2pj\") pod 
\"certified-operators-4qlrn\" (UID: \"8c37bfa2-47b2-493c-a4ff-5342118dcf93\") " pod="openshift-marketplace/certified-operators-4qlrn" Feb 23 00:12:42 crc kubenswrapper[4735]: I0223 00:12:42.261823 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c37bfa2-47b2-493c-a4ff-5342118dcf93-utilities\") pod \"certified-operators-4qlrn\" (UID: \"8c37bfa2-47b2-493c-a4ff-5342118dcf93\") " pod="openshift-marketplace/certified-operators-4qlrn" Feb 23 00:12:42 crc kubenswrapper[4735]: I0223 00:12:42.263293 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c37bfa2-47b2-493c-a4ff-5342118dcf93-catalog-content\") pod \"certified-operators-4qlrn\" (UID: \"8c37bfa2-47b2-493c-a4ff-5342118dcf93\") " pod="openshift-marketplace/certified-operators-4qlrn" Feb 23 00:12:42 crc kubenswrapper[4735]: I0223 00:12:42.263344 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c37bfa2-47b2-493c-a4ff-5342118dcf93-utilities\") pod \"certified-operators-4qlrn\" (UID: \"8c37bfa2-47b2-493c-a4ff-5342118dcf93\") " pod="openshift-marketplace/certified-operators-4qlrn" Feb 23 00:12:42 crc kubenswrapper[4735]: I0223 00:12:42.286308 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr2pj\" (UniqueName: \"kubernetes.io/projected/8c37bfa2-47b2-493c-a4ff-5342118dcf93-kube-api-access-rr2pj\") pod \"certified-operators-4qlrn\" (UID: \"8c37bfa2-47b2-493c-a4ff-5342118dcf93\") " pod="openshift-marketplace/certified-operators-4qlrn" Feb 23 00:12:42 crc kubenswrapper[4735]: I0223 00:12:42.314064 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4qlrn" Feb 23 00:12:42 crc kubenswrapper[4735]: I0223 00:12:42.614398 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6xh6c"] Feb 23 00:12:42 crc kubenswrapper[4735]: I0223 00:12:42.796802 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4qlrn"] Feb 23 00:12:42 crc kubenswrapper[4735]: W0223 00:12:42.805336 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c37bfa2_47b2_493c_a4ff_5342118dcf93.slice/crio-dfcf40345388ba4848bbaaff94e7b6d2c0349e0049bf7bebbae5d90a26e2f983 WatchSource:0}: Error finding container dfcf40345388ba4848bbaaff94e7b6d2c0349e0049bf7bebbae5d90a26e2f983: Status 404 returned error can't find the container with id dfcf40345388ba4848bbaaff94e7b6d2c0349e0049bf7bebbae5d90a26e2f983 Feb 23 00:12:43 crc kubenswrapper[4735]: I0223 00:12:43.395817 4735 generic.go:334] "Generic (PLEG): container finished" podID="8c37bfa2-47b2-493c-a4ff-5342118dcf93" containerID="24f6f7f7033faeb7f01c660391caedc37ff5cb620ba11b81dfdbf60d40be908d" exitCode=0 Feb 23 00:12:43 crc kubenswrapper[4735]: I0223 00:12:43.395969 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4qlrn" event={"ID":"8c37bfa2-47b2-493c-a4ff-5342118dcf93","Type":"ContainerDied","Data":"24f6f7f7033faeb7f01c660391caedc37ff5cb620ba11b81dfdbf60d40be908d"} Feb 23 00:12:43 crc kubenswrapper[4735]: I0223 00:12:43.396308 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4qlrn" event={"ID":"8c37bfa2-47b2-493c-a4ff-5342118dcf93","Type":"ContainerStarted","Data":"dfcf40345388ba4848bbaaff94e7b6d2c0349e0049bf7bebbae5d90a26e2f983"} Feb 23 00:12:43 crc kubenswrapper[4735]: I0223 00:12:43.398518 4735 generic.go:334] "Generic (PLEG): container finished" 
podID="faa0a6f2-8a6e-4c82-b876-9cc1cff42496" containerID="8c389cbb5c056bec21981f5cdad2c756e85c2f9f9880f923c97b5a9fca2aa3a2" exitCode=0 Feb 23 00:12:43 crc kubenswrapper[4735]: I0223 00:12:43.398555 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6xh6c" event={"ID":"faa0a6f2-8a6e-4c82-b876-9cc1cff42496","Type":"ContainerDied","Data":"8c389cbb5c056bec21981f5cdad2c756e85c2f9f9880f923c97b5a9fca2aa3a2"} Feb 23 00:12:43 crc kubenswrapper[4735]: I0223 00:12:43.398593 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6xh6c" event={"ID":"faa0a6f2-8a6e-4c82-b876-9cc1cff42496","Type":"ContainerStarted","Data":"eb0eeb02ab2cb5fc6b6d6f2f07a9dd7db08a20387ac0d572e54966266f4d0c44"} Feb 23 00:12:43 crc kubenswrapper[4735]: I0223 00:12:43.581807 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dkjz8"] Feb 23 00:12:43 crc kubenswrapper[4735]: I0223 00:12:43.583359 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dkjz8" Feb 23 00:12:43 crc kubenswrapper[4735]: I0223 00:12:43.592215 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 23 00:12:43 crc kubenswrapper[4735]: I0223 00:12:43.597578 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dkjz8"] Feb 23 00:12:43 crc kubenswrapper[4735]: I0223 00:12:43.685281 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6zdl\" (UniqueName: \"kubernetes.io/projected/71b6652a-e5e4-4ae8-967e-440f3912eca7-kube-api-access-s6zdl\") pod \"redhat-operators-dkjz8\" (UID: \"71b6652a-e5e4-4ae8-967e-440f3912eca7\") " pod="openshift-marketplace/redhat-operators-dkjz8" Feb 23 00:12:43 crc kubenswrapper[4735]: I0223 00:12:43.685835 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71b6652a-e5e4-4ae8-967e-440f3912eca7-utilities\") pod \"redhat-operators-dkjz8\" (UID: \"71b6652a-e5e4-4ae8-967e-440f3912eca7\") " pod="openshift-marketplace/redhat-operators-dkjz8" Feb 23 00:12:43 crc kubenswrapper[4735]: I0223 00:12:43.686022 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71b6652a-e5e4-4ae8-967e-440f3912eca7-catalog-content\") pod \"redhat-operators-dkjz8\" (UID: \"71b6652a-e5e4-4ae8-967e-440f3912eca7\") " pod="openshift-marketplace/redhat-operators-dkjz8" Feb 23 00:12:43 crc kubenswrapper[4735]: I0223 00:12:43.787659 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6zdl\" (UniqueName: \"kubernetes.io/projected/71b6652a-e5e4-4ae8-967e-440f3912eca7-kube-api-access-s6zdl\") pod \"redhat-operators-dkjz8\" (UID: 
\"71b6652a-e5e4-4ae8-967e-440f3912eca7\") " pod="openshift-marketplace/redhat-operators-dkjz8" Feb 23 00:12:43 crc kubenswrapper[4735]: I0223 00:12:43.787754 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71b6652a-e5e4-4ae8-967e-440f3912eca7-utilities\") pod \"redhat-operators-dkjz8\" (UID: \"71b6652a-e5e4-4ae8-967e-440f3912eca7\") " pod="openshift-marketplace/redhat-operators-dkjz8" Feb 23 00:12:43 crc kubenswrapper[4735]: I0223 00:12:43.787782 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71b6652a-e5e4-4ae8-967e-440f3912eca7-catalog-content\") pod \"redhat-operators-dkjz8\" (UID: \"71b6652a-e5e4-4ae8-967e-440f3912eca7\") " pod="openshift-marketplace/redhat-operators-dkjz8" Feb 23 00:12:43 crc kubenswrapper[4735]: I0223 00:12:43.788888 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71b6652a-e5e4-4ae8-967e-440f3912eca7-utilities\") pod \"redhat-operators-dkjz8\" (UID: \"71b6652a-e5e4-4ae8-967e-440f3912eca7\") " pod="openshift-marketplace/redhat-operators-dkjz8" Feb 23 00:12:43 crc kubenswrapper[4735]: I0223 00:12:43.791248 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71b6652a-e5e4-4ae8-967e-440f3912eca7-catalog-content\") pod \"redhat-operators-dkjz8\" (UID: \"71b6652a-e5e4-4ae8-967e-440f3912eca7\") " pod="openshift-marketplace/redhat-operators-dkjz8" Feb 23 00:12:43 crc kubenswrapper[4735]: I0223 00:12:43.822156 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6zdl\" (UniqueName: \"kubernetes.io/projected/71b6652a-e5e4-4ae8-967e-440f3912eca7-kube-api-access-s6zdl\") pod \"redhat-operators-dkjz8\" (UID: \"71b6652a-e5e4-4ae8-967e-440f3912eca7\") " 
pod="openshift-marketplace/redhat-operators-dkjz8" Feb 23 00:12:43 crc kubenswrapper[4735]: I0223 00:12:43.937655 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dkjz8" Feb 23 00:12:44 crc kubenswrapper[4735]: I0223 00:12:44.380624 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dkjz8"] Feb 23 00:12:44 crc kubenswrapper[4735]: I0223 00:12:44.407669 4735 generic.go:334] "Generic (PLEG): container finished" podID="faa0a6f2-8a6e-4c82-b876-9cc1cff42496" containerID="2e3d02f2cb2f4484f35a457dbf5dd54200165dbe9807a50fcffc02719cc529fb" exitCode=0 Feb 23 00:12:44 crc kubenswrapper[4735]: I0223 00:12:44.407727 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6xh6c" event={"ID":"faa0a6f2-8a6e-4c82-b876-9cc1cff42496","Type":"ContainerDied","Data":"2e3d02f2cb2f4484f35a457dbf5dd54200165dbe9807a50fcffc02719cc529fb"} Feb 23 00:12:44 crc kubenswrapper[4735]: I0223 00:12:44.411543 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4qlrn" event={"ID":"8c37bfa2-47b2-493c-a4ff-5342118dcf93","Type":"ContainerStarted","Data":"a9d7de0a6416510bf8f7d309650170133aa561a50abca87d557696601e781ca6"} Feb 23 00:12:44 crc kubenswrapper[4735]: I0223 00:12:44.593406 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l486w"] Feb 23 00:12:44 crc kubenswrapper[4735]: I0223 00:12:44.594899 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l486w" Feb 23 00:12:44 crc kubenswrapper[4735]: I0223 00:12:44.597338 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 23 00:12:44 crc kubenswrapper[4735]: I0223 00:12:44.599601 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l486w"] Feb 23 00:12:44 crc kubenswrapper[4735]: I0223 00:12:44.698772 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51e3d9d6-dbc9-4600-a5f0-8e450b62b4c3-utilities\") pod \"community-operators-l486w\" (UID: \"51e3d9d6-dbc9-4600-a5f0-8e450b62b4c3\") " pod="openshift-marketplace/community-operators-l486w" Feb 23 00:12:44 crc kubenswrapper[4735]: I0223 00:12:44.698890 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqgqz\" (UniqueName: \"kubernetes.io/projected/51e3d9d6-dbc9-4600-a5f0-8e450b62b4c3-kube-api-access-kqgqz\") pod \"community-operators-l486w\" (UID: \"51e3d9d6-dbc9-4600-a5f0-8e450b62b4c3\") " pod="openshift-marketplace/community-operators-l486w" Feb 23 00:12:44 crc kubenswrapper[4735]: I0223 00:12:44.698933 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51e3d9d6-dbc9-4600-a5f0-8e450b62b4c3-catalog-content\") pod \"community-operators-l486w\" (UID: \"51e3d9d6-dbc9-4600-a5f0-8e450b62b4c3\") " pod="openshift-marketplace/community-operators-l486w" Feb 23 00:12:44 crc kubenswrapper[4735]: I0223 00:12:44.799579 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqgqz\" (UniqueName: \"kubernetes.io/projected/51e3d9d6-dbc9-4600-a5f0-8e450b62b4c3-kube-api-access-kqgqz\") pod \"community-operators-l486w\" 
(UID: \"51e3d9d6-dbc9-4600-a5f0-8e450b62b4c3\") " pod="openshift-marketplace/community-operators-l486w" Feb 23 00:12:44 crc kubenswrapper[4735]: I0223 00:12:44.799630 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51e3d9d6-dbc9-4600-a5f0-8e450b62b4c3-catalog-content\") pod \"community-operators-l486w\" (UID: \"51e3d9d6-dbc9-4600-a5f0-8e450b62b4c3\") " pod="openshift-marketplace/community-operators-l486w" Feb 23 00:12:44 crc kubenswrapper[4735]: I0223 00:12:44.799679 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51e3d9d6-dbc9-4600-a5f0-8e450b62b4c3-utilities\") pod \"community-operators-l486w\" (UID: \"51e3d9d6-dbc9-4600-a5f0-8e450b62b4c3\") " pod="openshift-marketplace/community-operators-l486w" Feb 23 00:12:44 crc kubenswrapper[4735]: I0223 00:12:44.800085 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51e3d9d6-dbc9-4600-a5f0-8e450b62b4c3-utilities\") pod \"community-operators-l486w\" (UID: \"51e3d9d6-dbc9-4600-a5f0-8e450b62b4c3\") " pod="openshift-marketplace/community-operators-l486w" Feb 23 00:12:44 crc kubenswrapper[4735]: I0223 00:12:44.800407 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51e3d9d6-dbc9-4600-a5f0-8e450b62b4c3-catalog-content\") pod \"community-operators-l486w\" (UID: \"51e3d9d6-dbc9-4600-a5f0-8e450b62b4c3\") " pod="openshift-marketplace/community-operators-l486w" Feb 23 00:12:44 crc kubenswrapper[4735]: I0223 00:12:44.831749 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqgqz\" (UniqueName: \"kubernetes.io/projected/51e3d9d6-dbc9-4600-a5f0-8e450b62b4c3-kube-api-access-kqgqz\") pod \"community-operators-l486w\" (UID: \"51e3d9d6-dbc9-4600-a5f0-8e450b62b4c3\") " 
pod="openshift-marketplace/community-operators-l486w" Feb 23 00:12:44 crc kubenswrapper[4735]: I0223 00:12:44.918598 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l486w" Feb 23 00:12:45 crc kubenswrapper[4735]: I0223 00:12:45.378662 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l486w"] Feb 23 00:12:45 crc kubenswrapper[4735]: W0223 00:12:45.390142 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51e3d9d6_dbc9_4600_a5f0_8e450b62b4c3.slice/crio-9b67b5b3136bb8f3cab6efb11588dd58a2f2e94167be25d94f5302ec5ac0af92 WatchSource:0}: Error finding container 9b67b5b3136bb8f3cab6efb11588dd58a2f2e94167be25d94f5302ec5ac0af92: Status 404 returned error can't find the container with id 9b67b5b3136bb8f3cab6efb11588dd58a2f2e94167be25d94f5302ec5ac0af92 Feb 23 00:12:45 crc kubenswrapper[4735]: I0223 00:12:45.420814 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6xh6c" event={"ID":"faa0a6f2-8a6e-4c82-b876-9cc1cff42496","Type":"ContainerStarted","Data":"3bd5e96205bb309bd24e97cb77f92dc3fa0c3bfed548bae7d8c23da1096f6ff7"} Feb 23 00:12:45 crc kubenswrapper[4735]: I0223 00:12:45.422388 4735 generic.go:334] "Generic (PLEG): container finished" podID="8c37bfa2-47b2-493c-a4ff-5342118dcf93" containerID="a9d7de0a6416510bf8f7d309650170133aa561a50abca87d557696601e781ca6" exitCode=0 Feb 23 00:12:45 crc kubenswrapper[4735]: I0223 00:12:45.422445 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4qlrn" event={"ID":"8c37bfa2-47b2-493c-a4ff-5342118dcf93","Type":"ContainerDied","Data":"a9d7de0a6416510bf8f7d309650170133aa561a50abca87d557696601e781ca6"} Feb 23 00:12:45 crc kubenswrapper[4735]: I0223 00:12:45.425251 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-l486w" event={"ID":"51e3d9d6-dbc9-4600-a5f0-8e450b62b4c3","Type":"ContainerStarted","Data":"9b67b5b3136bb8f3cab6efb11588dd58a2f2e94167be25d94f5302ec5ac0af92"} Feb 23 00:12:45 crc kubenswrapper[4735]: I0223 00:12:45.427744 4735 generic.go:334] "Generic (PLEG): container finished" podID="71b6652a-e5e4-4ae8-967e-440f3912eca7" containerID="66936820dfa843801c91aff84bff7362bc979c42250d98682bd7d788c67fd055" exitCode=0 Feb 23 00:12:45 crc kubenswrapper[4735]: I0223 00:12:45.427790 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dkjz8" event={"ID":"71b6652a-e5e4-4ae8-967e-440f3912eca7","Type":"ContainerDied","Data":"66936820dfa843801c91aff84bff7362bc979c42250d98682bd7d788c67fd055"} Feb 23 00:12:45 crc kubenswrapper[4735]: I0223 00:12:45.427821 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dkjz8" event={"ID":"71b6652a-e5e4-4ae8-967e-440f3912eca7","Type":"ContainerStarted","Data":"b6d3be734dc16ece7c6d80c998067efc1e21018f4e2a00efca971e2d735ee3b1"} Feb 23 00:12:45 crc kubenswrapper[4735]: I0223 00:12:45.444228 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6xh6c" podStartSLOduration=3.0598259309999998 podStartE2EDuration="4.444211995s" podCreationTimestamp="2026-02-23 00:12:41 +0000 UTC" firstStartedPulling="2026-02-23 00:12:43.400623819 +0000 UTC m=+321.864169840" lastFinishedPulling="2026-02-23 00:12:44.785009933 +0000 UTC m=+323.248555904" observedRunningTime="2026-02-23 00:12:45.44255325 +0000 UTC m=+323.906099271" watchObservedRunningTime="2026-02-23 00:12:45.444211995 +0000 UTC m=+323.907757976" Feb 23 00:12:46 crc kubenswrapper[4735]: I0223 00:12:46.435601 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dkjz8" 
event={"ID":"71b6652a-e5e4-4ae8-967e-440f3912eca7","Type":"ContainerStarted","Data":"03f9c22ef88dfd35a44834f6682704a0bae3d4a67f9d2a09a1240f984e596d5e"} Feb 23 00:12:46 crc kubenswrapper[4735]: I0223 00:12:46.438994 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4qlrn" event={"ID":"8c37bfa2-47b2-493c-a4ff-5342118dcf93","Type":"ContainerStarted","Data":"a8d0247c5ff696548a8bfa619b3be0073afe7e6c44e09cb7f23cdcc44d9f733a"} Feb 23 00:12:46 crc kubenswrapper[4735]: I0223 00:12:46.441015 4735 generic.go:334] "Generic (PLEG): container finished" podID="51e3d9d6-dbc9-4600-a5f0-8e450b62b4c3" containerID="3b9b5a24ed576bbe7c5d7dbe7249ffcc7c45a9b5aa0ec0edf1cc5d08e29b15b5" exitCode=0 Feb 23 00:12:46 crc kubenswrapper[4735]: I0223 00:12:46.441063 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l486w" event={"ID":"51e3d9d6-dbc9-4600-a5f0-8e450b62b4c3","Type":"ContainerDied","Data":"3b9b5a24ed576bbe7c5d7dbe7249ffcc7c45a9b5aa0ec0edf1cc5d08e29b15b5"} Feb 23 00:12:46 crc kubenswrapper[4735]: I0223 00:12:46.498605 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4qlrn" podStartSLOduration=3.006886894 podStartE2EDuration="5.498581803s" podCreationTimestamp="2026-02-23 00:12:41 +0000 UTC" firstStartedPulling="2026-02-23 00:12:43.399268212 +0000 UTC m=+321.862814183" lastFinishedPulling="2026-02-23 00:12:45.890963121 +0000 UTC m=+324.354509092" observedRunningTime="2026-02-23 00:12:46.494582123 +0000 UTC m=+324.958128104" watchObservedRunningTime="2026-02-23 00:12:46.498581803 +0000 UTC m=+324.962127784" Feb 23 00:12:47 crc kubenswrapper[4735]: I0223 00:12:47.447991 4735 generic.go:334] "Generic (PLEG): container finished" podID="51e3d9d6-dbc9-4600-a5f0-8e450b62b4c3" containerID="5ebe6625616067798370e7f99550bdf54c9653dfb112780bcbdf5281668deb3e" exitCode=0 Feb 23 00:12:47 crc kubenswrapper[4735]: I0223 
00:12:47.448097 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l486w" event={"ID":"51e3d9d6-dbc9-4600-a5f0-8e450b62b4c3","Type":"ContainerDied","Data":"5ebe6625616067798370e7f99550bdf54c9653dfb112780bcbdf5281668deb3e"} Feb 23 00:12:47 crc kubenswrapper[4735]: I0223 00:12:47.451052 4735 generic.go:334] "Generic (PLEG): container finished" podID="71b6652a-e5e4-4ae8-967e-440f3912eca7" containerID="03f9c22ef88dfd35a44834f6682704a0bae3d4a67f9d2a09a1240f984e596d5e" exitCode=0 Feb 23 00:12:47 crc kubenswrapper[4735]: I0223 00:12:47.451737 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dkjz8" event={"ID":"71b6652a-e5e4-4ae8-967e-440f3912eca7","Type":"ContainerDied","Data":"03f9c22ef88dfd35a44834f6682704a0bae3d4a67f9d2a09a1240f984e596d5e"} Feb 23 00:12:48 crc kubenswrapper[4735]: I0223 00:12:48.458585 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l486w" event={"ID":"51e3d9d6-dbc9-4600-a5f0-8e450b62b4c3","Type":"ContainerStarted","Data":"21a30bdac7870b8dd20e480fc574292f9cc465b4d246ce43f3f37040acea6729"} Feb 23 00:12:48 crc kubenswrapper[4735]: I0223 00:12:48.460985 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dkjz8" event={"ID":"71b6652a-e5e4-4ae8-967e-440f3912eca7","Type":"ContainerStarted","Data":"5db72b539f5e83f1f47760a712284d6fb044003b879bc0548521039ae6db3977"} Feb 23 00:12:48 crc kubenswrapper[4735]: I0223 00:12:48.480351 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l486w" podStartSLOduration=3.086887603 podStartE2EDuration="4.480326788s" podCreationTimestamp="2026-02-23 00:12:44 +0000 UTC" firstStartedPulling="2026-02-23 00:12:46.442759547 +0000 UTC m=+324.906305528" lastFinishedPulling="2026-02-23 00:12:47.836198742 +0000 UTC m=+326.299744713" observedRunningTime="2026-02-23 
00:12:48.477050632 +0000 UTC m=+326.940596613" watchObservedRunningTime="2026-02-23 00:12:48.480326788 +0000 UTC m=+326.943872799" Feb 23 00:12:48 crc kubenswrapper[4735]: I0223 00:12:48.494669 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dkjz8" podStartSLOduration=3.074299738 podStartE2EDuration="5.494647331s" podCreationTimestamp="2026-02-23 00:12:43 +0000 UTC" firstStartedPulling="2026-02-23 00:12:45.429489633 +0000 UTC m=+323.893035624" lastFinishedPulling="2026-02-23 00:12:47.849837206 +0000 UTC m=+326.313383217" observedRunningTime="2026-02-23 00:12:48.491774968 +0000 UTC m=+326.955320939" watchObservedRunningTime="2026-02-23 00:12:48.494647331 +0000 UTC m=+326.958193302" Feb 23 00:12:52 crc kubenswrapper[4735]: I0223 00:12:52.121063 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6xh6c" Feb 23 00:12:52 crc kubenswrapper[4735]: I0223 00:12:52.121819 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6xh6c" Feb 23 00:12:52 crc kubenswrapper[4735]: I0223 00:12:52.172482 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6xh6c" Feb 23 00:12:52 crc kubenswrapper[4735]: I0223 00:12:52.315351 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4qlrn" Feb 23 00:12:52 crc kubenswrapper[4735]: I0223 00:12:52.315414 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4qlrn" Feb 23 00:12:52 crc kubenswrapper[4735]: I0223 00:12:52.379185 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4qlrn" Feb 23 00:12:52 crc kubenswrapper[4735]: I0223 00:12:52.524025 4735 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6xh6c" Feb 23 00:12:52 crc kubenswrapper[4735]: I0223 00:12:52.567966 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4qlrn" Feb 23 00:12:53 crc kubenswrapper[4735]: I0223 00:12:53.939031 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dkjz8" Feb 23 00:12:53 crc kubenswrapper[4735]: I0223 00:12:53.939573 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dkjz8" Feb 23 00:12:54 crc kubenswrapper[4735]: I0223 00:12:54.919414 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l486w" Feb 23 00:12:54 crc kubenswrapper[4735]: I0223 00:12:54.919789 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l486w" Feb 23 00:12:54 crc kubenswrapper[4735]: I0223 00:12:54.964492 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l486w" Feb 23 00:12:55 crc kubenswrapper[4735]: I0223 00:12:55.000226 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dkjz8" podUID="71b6652a-e5e4-4ae8-967e-440f3912eca7" containerName="registry-server" probeResult="failure" output=< Feb 23 00:12:55 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Feb 23 00:12:55 crc kubenswrapper[4735]: > Feb 23 00:12:55 crc kubenswrapper[4735]: I0223 00:12:55.565072 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-l486w" Feb 23 00:13:04 crc kubenswrapper[4735]: I0223 00:13:04.010244 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-dkjz8" Feb 23 00:13:04 crc kubenswrapper[4735]: I0223 00:13:04.082631 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dkjz8" Feb 23 00:13:11 crc kubenswrapper[4735]: I0223 00:13:11.535895 4735 patch_prober.go:28] interesting pod/machine-config-daemon-blmnv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 00:13:11 crc kubenswrapper[4735]: I0223 00:13:11.536550 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" podUID="1cba474f-2d55-4a07-969f-25e2817a06d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 00:13:34 crc kubenswrapper[4735]: I0223 00:13:34.704559 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-9tphn"] Feb 23 00:13:34 crc kubenswrapper[4735]: I0223 00:13:34.707199 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-9tphn" Feb 23 00:13:34 crc kubenswrapper[4735]: I0223 00:13:34.720150 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-9tphn"] Feb 23 00:13:34 crc kubenswrapper[4735]: I0223 00:13:34.839245 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv4rm\" (UniqueName: \"kubernetes.io/projected/f156b1cb-cd40-42a5-82b3-96813df2038b-kube-api-access-xv4rm\") pod \"image-registry-66df7c8f76-9tphn\" (UID: \"f156b1cb-cd40-42a5-82b3-96813df2038b\") " pod="openshift-image-registry/image-registry-66df7c8f76-9tphn" Feb 23 00:13:34 crc kubenswrapper[4735]: I0223 00:13:34.839357 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f156b1cb-cd40-42a5-82b3-96813df2038b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-9tphn\" (UID: \"f156b1cb-cd40-42a5-82b3-96813df2038b\") " pod="openshift-image-registry/image-registry-66df7c8f76-9tphn" Feb 23 00:13:34 crc kubenswrapper[4735]: I0223 00:13:34.839440 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f156b1cb-cd40-42a5-82b3-96813df2038b-trusted-ca\") pod \"image-registry-66df7c8f76-9tphn\" (UID: \"f156b1cb-cd40-42a5-82b3-96813df2038b\") " pod="openshift-image-registry/image-registry-66df7c8f76-9tphn" Feb 23 00:13:34 crc kubenswrapper[4735]: I0223 00:13:34.839487 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f156b1cb-cd40-42a5-82b3-96813df2038b-registry-tls\") pod \"image-registry-66df7c8f76-9tphn\" (UID: \"f156b1cb-cd40-42a5-82b3-96813df2038b\") " pod="openshift-image-registry/image-registry-66df7c8f76-9tphn" Feb 
23 00:13:34 crc kubenswrapper[4735]: I0223 00:13:34.839543 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-9tphn\" (UID: \"f156b1cb-cd40-42a5-82b3-96813df2038b\") " pod="openshift-image-registry/image-registry-66df7c8f76-9tphn" Feb 23 00:13:34 crc kubenswrapper[4735]: I0223 00:13:34.839629 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f156b1cb-cd40-42a5-82b3-96813df2038b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-9tphn\" (UID: \"f156b1cb-cd40-42a5-82b3-96813df2038b\") " pod="openshift-image-registry/image-registry-66df7c8f76-9tphn" Feb 23 00:13:34 crc kubenswrapper[4735]: I0223 00:13:34.839717 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f156b1cb-cd40-42a5-82b3-96813df2038b-bound-sa-token\") pod \"image-registry-66df7c8f76-9tphn\" (UID: \"f156b1cb-cd40-42a5-82b3-96813df2038b\") " pod="openshift-image-registry/image-registry-66df7c8f76-9tphn" Feb 23 00:13:34 crc kubenswrapper[4735]: I0223 00:13:34.839942 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f156b1cb-cd40-42a5-82b3-96813df2038b-registry-certificates\") pod \"image-registry-66df7c8f76-9tphn\" (UID: \"f156b1cb-cd40-42a5-82b3-96813df2038b\") " pod="openshift-image-registry/image-registry-66df7c8f76-9tphn" Feb 23 00:13:34 crc kubenswrapper[4735]: I0223 00:13:34.895486 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-9tphn\" (UID: \"f156b1cb-cd40-42a5-82b3-96813df2038b\") " pod="openshift-image-registry/image-registry-66df7c8f76-9tphn" Feb 23 00:13:34 crc kubenswrapper[4735]: I0223 00:13:34.941743 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f156b1cb-cd40-42a5-82b3-96813df2038b-trusted-ca\") pod \"image-registry-66df7c8f76-9tphn\" (UID: \"f156b1cb-cd40-42a5-82b3-96813df2038b\") " pod="openshift-image-registry/image-registry-66df7c8f76-9tphn" Feb 23 00:13:34 crc kubenswrapper[4735]: I0223 00:13:34.941830 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f156b1cb-cd40-42a5-82b3-96813df2038b-registry-tls\") pod \"image-registry-66df7c8f76-9tphn\" (UID: \"f156b1cb-cd40-42a5-82b3-96813df2038b\") " pod="openshift-image-registry/image-registry-66df7c8f76-9tphn" Feb 23 00:13:34 crc kubenswrapper[4735]: I0223 00:13:34.941943 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f156b1cb-cd40-42a5-82b3-96813df2038b-bound-sa-token\") pod \"image-registry-66df7c8f76-9tphn\" (UID: \"f156b1cb-cd40-42a5-82b3-96813df2038b\") " pod="openshift-image-registry/image-registry-66df7c8f76-9tphn" Feb 23 00:13:34 crc kubenswrapper[4735]: I0223 00:13:34.941982 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f156b1cb-cd40-42a5-82b3-96813df2038b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-9tphn\" (UID: \"f156b1cb-cd40-42a5-82b3-96813df2038b\") " pod="openshift-image-registry/image-registry-66df7c8f76-9tphn" Feb 23 00:13:34 crc kubenswrapper[4735]: I0223 00:13:34.942035 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f156b1cb-cd40-42a5-82b3-96813df2038b-registry-certificates\") pod \"image-registry-66df7c8f76-9tphn\" (UID: \"f156b1cb-cd40-42a5-82b3-96813df2038b\") " pod="openshift-image-registry/image-registry-66df7c8f76-9tphn" Feb 23 00:13:34 crc kubenswrapper[4735]: I0223 00:13:34.942098 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xv4rm\" (UniqueName: \"kubernetes.io/projected/f156b1cb-cd40-42a5-82b3-96813df2038b-kube-api-access-xv4rm\") pod \"image-registry-66df7c8f76-9tphn\" (UID: \"f156b1cb-cd40-42a5-82b3-96813df2038b\") " pod="openshift-image-registry/image-registry-66df7c8f76-9tphn" Feb 23 00:13:34 crc kubenswrapper[4735]: I0223 00:13:34.942791 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f156b1cb-cd40-42a5-82b3-96813df2038b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-9tphn\" (UID: \"f156b1cb-cd40-42a5-82b3-96813df2038b\") " pod="openshift-image-registry/image-registry-66df7c8f76-9tphn" Feb 23 00:13:34 crc kubenswrapper[4735]: I0223 00:13:34.943350 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f156b1cb-cd40-42a5-82b3-96813df2038b-registry-certificates\") pod \"image-registry-66df7c8f76-9tphn\" (UID: \"f156b1cb-cd40-42a5-82b3-96813df2038b\") " pod="openshift-image-registry/image-registry-66df7c8f76-9tphn" Feb 23 00:13:34 crc kubenswrapper[4735]: I0223 00:13:34.943495 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f156b1cb-cd40-42a5-82b3-96813df2038b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-9tphn\" (UID: \"f156b1cb-cd40-42a5-82b3-96813df2038b\") " pod="openshift-image-registry/image-registry-66df7c8f76-9tphn" 
Feb 23 00:13:34 crc kubenswrapper[4735]: I0223 00:13:34.944222 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f156b1cb-cd40-42a5-82b3-96813df2038b-trusted-ca\") pod \"image-registry-66df7c8f76-9tphn\" (UID: \"f156b1cb-cd40-42a5-82b3-96813df2038b\") " pod="openshift-image-registry/image-registry-66df7c8f76-9tphn" Feb 23 00:13:34 crc kubenswrapper[4735]: I0223 00:13:34.948605 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f156b1cb-cd40-42a5-82b3-96813df2038b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-9tphn\" (UID: \"f156b1cb-cd40-42a5-82b3-96813df2038b\") " pod="openshift-image-registry/image-registry-66df7c8f76-9tphn" Feb 23 00:13:34 crc kubenswrapper[4735]: I0223 00:13:34.958991 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f156b1cb-cd40-42a5-82b3-96813df2038b-registry-tls\") pod \"image-registry-66df7c8f76-9tphn\" (UID: \"f156b1cb-cd40-42a5-82b3-96813df2038b\") " pod="openshift-image-registry/image-registry-66df7c8f76-9tphn" Feb 23 00:13:34 crc kubenswrapper[4735]: I0223 00:13:34.965350 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv4rm\" (UniqueName: \"kubernetes.io/projected/f156b1cb-cd40-42a5-82b3-96813df2038b-kube-api-access-xv4rm\") pod \"image-registry-66df7c8f76-9tphn\" (UID: \"f156b1cb-cd40-42a5-82b3-96813df2038b\") " pod="openshift-image-registry/image-registry-66df7c8f76-9tphn" Feb 23 00:13:34 crc kubenswrapper[4735]: I0223 00:13:34.970637 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f156b1cb-cd40-42a5-82b3-96813df2038b-bound-sa-token\") pod \"image-registry-66df7c8f76-9tphn\" (UID: \"f156b1cb-cd40-42a5-82b3-96813df2038b\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-9tphn" Feb 23 00:13:35 crc kubenswrapper[4735]: I0223 00:13:35.023635 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-9tphn" Feb 23 00:13:35 crc kubenswrapper[4735]: I0223 00:13:35.510313 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-9tphn"] Feb 23 00:13:35 crc kubenswrapper[4735]: I0223 00:13:35.770127 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-9tphn" event={"ID":"f156b1cb-cd40-42a5-82b3-96813df2038b","Type":"ContainerStarted","Data":"0c949ec0a14427aab33f12a9a3da3e67fac1b70575059de1fa2ff602f1519660"} Feb 23 00:13:35 crc kubenswrapper[4735]: I0223 00:13:35.770500 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-9tphn" event={"ID":"f156b1cb-cd40-42a5-82b3-96813df2038b","Type":"ContainerStarted","Data":"f0ac879a96fa373396c01cdb59872da4cfb63f9a9fb60d1a270d40614938f776"} Feb 23 00:13:35 crc kubenswrapper[4735]: I0223 00:13:35.770539 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-9tphn" Feb 23 00:13:35 crc kubenswrapper[4735]: I0223 00:13:35.798002 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-9tphn" podStartSLOduration=1.797973732 podStartE2EDuration="1.797973732s" podCreationTimestamp="2026-02-23 00:13:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:13:35.794542774 +0000 UTC m=+374.258088765" watchObservedRunningTime="2026-02-23 00:13:35.797973732 +0000 UTC m=+374.261519733" Feb 23 00:13:41 crc kubenswrapper[4735]: I0223 00:13:41.513182 4735 patch_prober.go:28] interesting 
pod/machine-config-daemon-blmnv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 00:13:41 crc kubenswrapper[4735]: I0223 00:13:41.513819 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" podUID="1cba474f-2d55-4a07-969f-25e2817a06d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 00:13:55 crc kubenswrapper[4735]: I0223 00:13:55.032214 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-9tphn" Feb 23 00:13:55 crc kubenswrapper[4735]: I0223 00:13:55.100617 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6lsk2"] Feb 23 00:14:11 crc kubenswrapper[4735]: I0223 00:14:11.512589 4735 patch_prober.go:28] interesting pod/machine-config-daemon-blmnv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 00:14:11 crc kubenswrapper[4735]: I0223 00:14:11.513343 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" podUID="1cba474f-2d55-4a07-969f-25e2817a06d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 00:14:11 crc kubenswrapper[4735]: I0223 00:14:11.513413 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" Feb 23 00:14:11 
crc kubenswrapper[4735]: I0223 00:14:11.514306 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"32a3e61de17574fe655e2f95d40d79b68f89d06de483f3a878f524bc13ce427d"} pod="openshift-machine-config-operator/machine-config-daemon-blmnv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 00:14:11 crc kubenswrapper[4735]: I0223 00:14:11.514441 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" podUID="1cba474f-2d55-4a07-969f-25e2817a06d0" containerName="machine-config-daemon" containerID="cri-o://32a3e61de17574fe655e2f95d40d79b68f89d06de483f3a878f524bc13ce427d" gracePeriod=600 Feb 23 00:14:12 crc kubenswrapper[4735]: I0223 00:14:12.030591 4735 generic.go:334] "Generic (PLEG): container finished" podID="1cba474f-2d55-4a07-969f-25e2817a06d0" containerID="32a3e61de17574fe655e2f95d40d79b68f89d06de483f3a878f524bc13ce427d" exitCode=0 Feb 23 00:14:12 crc kubenswrapper[4735]: I0223 00:14:12.030686 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" event={"ID":"1cba474f-2d55-4a07-969f-25e2817a06d0","Type":"ContainerDied","Data":"32a3e61de17574fe655e2f95d40d79b68f89d06de483f3a878f524bc13ce427d"} Feb 23 00:14:12 crc kubenswrapper[4735]: I0223 00:14:12.031050 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" event={"ID":"1cba474f-2d55-4a07-969f-25e2817a06d0","Type":"ContainerStarted","Data":"9ffff5ef8ce3a35f166e6769e8ce86e4bf9ce64b374895f07abf527d84d7182c"} Feb 23 00:14:12 crc kubenswrapper[4735]: I0223 00:14:12.031082 4735 scope.go:117] "RemoveContainer" containerID="908fd79ca8e314a794a24b502c23d0b5abbc41b2bf079086e8fe6fc3ceaeafa0" Feb 23 00:14:20 crc kubenswrapper[4735]: I0223 00:14:20.155365 4735 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" podUID="769cd336-e909-4164-89c9-e0874926fd3d" containerName="registry" containerID="cri-o://f3b3df2f386cc3a232b61cb7782c538a5978f91757cf15847426019748f25a92" gracePeriod=30 Feb 23 00:14:20 crc kubenswrapper[4735]: I0223 00:14:20.594710 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:14:20 crc kubenswrapper[4735]: I0223 00:14:20.788955 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"769cd336-e909-4164-89c9-e0874926fd3d\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " Feb 23 00:14:20 crc kubenswrapper[4735]: I0223 00:14:20.789027 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/769cd336-e909-4164-89c9-e0874926fd3d-trusted-ca\") pod \"769cd336-e909-4164-89c9-e0874926fd3d\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " Feb 23 00:14:20 crc kubenswrapper[4735]: I0223 00:14:20.789080 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4b6f8\" (UniqueName: \"kubernetes.io/projected/769cd336-e909-4164-89c9-e0874926fd3d-kube-api-access-4b6f8\") pod \"769cd336-e909-4164-89c9-e0874926fd3d\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " Feb 23 00:14:20 crc kubenswrapper[4735]: I0223 00:14:20.789125 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/769cd336-e909-4164-89c9-e0874926fd3d-installation-pull-secrets\") pod \"769cd336-e909-4164-89c9-e0874926fd3d\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " Feb 23 00:14:20 crc 
kubenswrapper[4735]: I0223 00:14:20.789220 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/769cd336-e909-4164-89c9-e0874926fd3d-bound-sa-token\") pod \"769cd336-e909-4164-89c9-e0874926fd3d\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " Feb 23 00:14:20 crc kubenswrapper[4735]: I0223 00:14:20.789317 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/769cd336-e909-4164-89c9-e0874926fd3d-ca-trust-extracted\") pod \"769cd336-e909-4164-89c9-e0874926fd3d\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " Feb 23 00:14:20 crc kubenswrapper[4735]: I0223 00:14:20.789382 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/769cd336-e909-4164-89c9-e0874926fd3d-registry-tls\") pod \"769cd336-e909-4164-89c9-e0874926fd3d\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " Feb 23 00:14:20 crc kubenswrapper[4735]: I0223 00:14:20.789427 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/769cd336-e909-4164-89c9-e0874926fd3d-registry-certificates\") pod \"769cd336-e909-4164-89c9-e0874926fd3d\" (UID: \"769cd336-e909-4164-89c9-e0874926fd3d\") " Feb 23 00:14:20 crc kubenswrapper[4735]: I0223 00:14:20.791017 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/769cd336-e909-4164-89c9-e0874926fd3d-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "769cd336-e909-4164-89c9-e0874926fd3d" (UID: "769cd336-e909-4164-89c9-e0874926fd3d"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:14:20 crc kubenswrapper[4735]: I0223 00:14:20.791089 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/769cd336-e909-4164-89c9-e0874926fd3d-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "769cd336-e909-4164-89c9-e0874926fd3d" (UID: "769cd336-e909-4164-89c9-e0874926fd3d"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:14:20 crc kubenswrapper[4735]: I0223 00:14:20.799153 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/769cd336-e909-4164-89c9-e0874926fd3d-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "769cd336-e909-4164-89c9-e0874926fd3d" (UID: "769cd336-e909-4164-89c9-e0874926fd3d"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:14:20 crc kubenswrapper[4735]: I0223 00:14:20.799351 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/769cd336-e909-4164-89c9-e0874926fd3d-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "769cd336-e909-4164-89c9-e0874926fd3d" (UID: "769cd336-e909-4164-89c9-e0874926fd3d"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:14:20 crc kubenswrapper[4735]: I0223 00:14:20.799510 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/769cd336-e909-4164-89c9-e0874926fd3d-kube-api-access-4b6f8" (OuterVolumeSpecName: "kube-api-access-4b6f8") pod "769cd336-e909-4164-89c9-e0874926fd3d" (UID: "769cd336-e909-4164-89c9-e0874926fd3d"). InnerVolumeSpecName "kube-api-access-4b6f8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:14:20 crc kubenswrapper[4735]: I0223 00:14:20.800611 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/769cd336-e909-4164-89c9-e0874926fd3d-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "769cd336-e909-4164-89c9-e0874926fd3d" (UID: "769cd336-e909-4164-89c9-e0874926fd3d"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:14:20 crc kubenswrapper[4735]: I0223 00:14:20.806022 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "769cd336-e909-4164-89c9-e0874926fd3d" (UID: "769cd336-e909-4164-89c9-e0874926fd3d"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 23 00:14:20 crc kubenswrapper[4735]: I0223 00:14:20.823403 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/769cd336-e909-4164-89c9-e0874926fd3d-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "769cd336-e909-4164-89c9-e0874926fd3d" (UID: "769cd336-e909-4164-89c9-e0874926fd3d"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 00:14:20 crc kubenswrapper[4735]: I0223 00:14:20.890611 4735 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/769cd336-e909-4164-89c9-e0874926fd3d-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 23 00:14:20 crc kubenswrapper[4735]: I0223 00:14:20.890659 4735 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/769cd336-e909-4164-89c9-e0874926fd3d-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 23 00:14:20 crc kubenswrapper[4735]: I0223 00:14:20.890678 4735 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/769cd336-e909-4164-89c9-e0874926fd3d-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 23 00:14:20 crc kubenswrapper[4735]: I0223 00:14:20.890699 4735 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/769cd336-e909-4164-89c9-e0874926fd3d-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 23 00:14:20 crc kubenswrapper[4735]: I0223 00:14:20.890716 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4b6f8\" (UniqueName: \"kubernetes.io/projected/769cd336-e909-4164-89c9-e0874926fd3d-kube-api-access-4b6f8\") on node \"crc\" DevicePath \"\"" Feb 23 00:14:20 crc kubenswrapper[4735]: I0223 00:14:20.890732 4735 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/769cd336-e909-4164-89c9-e0874926fd3d-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 23 00:14:20 crc kubenswrapper[4735]: I0223 00:14:20.890749 4735 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/769cd336-e909-4164-89c9-e0874926fd3d-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 23 00:14:21 crc 
kubenswrapper[4735]: I0223 00:14:21.102245 4735 generic.go:334] "Generic (PLEG): container finished" podID="769cd336-e909-4164-89c9-e0874926fd3d" containerID="f3b3df2f386cc3a232b61cb7782c538a5978f91757cf15847426019748f25a92" exitCode=0 Feb 23 00:14:21 crc kubenswrapper[4735]: I0223 00:14:21.102340 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" event={"ID":"769cd336-e909-4164-89c9-e0874926fd3d","Type":"ContainerDied","Data":"f3b3df2f386cc3a232b61cb7782c538a5978f91757cf15847426019748f25a92"} Feb 23 00:14:21 crc kubenswrapper[4735]: I0223 00:14:21.103039 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" event={"ID":"769cd336-e909-4164-89c9-e0874926fd3d","Type":"ContainerDied","Data":"8ce72b324ba51812a49f5b49552e6fe628b33d566a75b5a6efd4b6b301f0295b"} Feb 23 00:14:21 crc kubenswrapper[4735]: I0223 00:14:21.102381 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-6lsk2" Feb 23 00:14:21 crc kubenswrapper[4735]: I0223 00:14:21.103073 4735 scope.go:117] "RemoveContainer" containerID="f3b3df2f386cc3a232b61cb7782c538a5978f91757cf15847426019748f25a92" Feb 23 00:14:21 crc kubenswrapper[4735]: I0223 00:14:21.133692 4735 scope.go:117] "RemoveContainer" containerID="f3b3df2f386cc3a232b61cb7782c538a5978f91757cf15847426019748f25a92" Feb 23 00:14:21 crc kubenswrapper[4735]: E0223 00:14:21.135954 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3b3df2f386cc3a232b61cb7782c538a5978f91757cf15847426019748f25a92\": container with ID starting with f3b3df2f386cc3a232b61cb7782c538a5978f91757cf15847426019748f25a92 not found: ID does not exist" containerID="f3b3df2f386cc3a232b61cb7782c538a5978f91757cf15847426019748f25a92" Feb 23 00:14:21 crc kubenswrapper[4735]: I0223 00:14:21.136091 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3b3df2f386cc3a232b61cb7782c538a5978f91757cf15847426019748f25a92"} err="failed to get container status \"f3b3df2f386cc3a232b61cb7782c538a5978f91757cf15847426019748f25a92\": rpc error: code = NotFound desc = could not find container \"f3b3df2f386cc3a232b61cb7782c538a5978f91757cf15847426019748f25a92\": container with ID starting with f3b3df2f386cc3a232b61cb7782c538a5978f91757cf15847426019748f25a92 not found: ID does not exist" Feb 23 00:14:21 crc kubenswrapper[4735]: I0223 00:14:21.160354 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6lsk2"] Feb 23 00:14:21 crc kubenswrapper[4735]: I0223 00:14:21.170153 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-6lsk2"] Feb 23 00:14:22 crc kubenswrapper[4735]: I0223 00:14:22.300839 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="769cd336-e909-4164-89c9-e0874926fd3d" path="/var/lib/kubelet/pods/769cd336-e909-4164-89c9-e0874926fd3d/volumes" Feb 23 00:15:00 crc kubenswrapper[4735]: I0223 00:15:00.199816 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530095-7fhdc"] Feb 23 00:15:00 crc kubenswrapper[4735]: E0223 00:15:00.200824 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="769cd336-e909-4164-89c9-e0874926fd3d" containerName="registry" Feb 23 00:15:00 crc kubenswrapper[4735]: I0223 00:15:00.200845 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="769cd336-e909-4164-89c9-e0874926fd3d" containerName="registry" Feb 23 00:15:00 crc kubenswrapper[4735]: I0223 00:15:00.201098 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="769cd336-e909-4164-89c9-e0874926fd3d" containerName="registry" Feb 23 00:15:00 crc kubenswrapper[4735]: I0223 00:15:00.201899 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530095-7fhdc" Feb 23 00:15:00 crc kubenswrapper[4735]: I0223 00:15:00.205174 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 23 00:15:00 crc kubenswrapper[4735]: I0223 00:15:00.205222 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 23 00:15:00 crc kubenswrapper[4735]: I0223 00:15:00.211881 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530095-7fhdc"] Feb 23 00:15:00 crc kubenswrapper[4735]: I0223 00:15:00.255960 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e3fba12-4715-4105-bbef-e877d9c3582b-config-volume\") pod \"collect-profiles-29530095-7fhdc\" (UID: \"5e3fba12-4715-4105-bbef-e877d9c3582b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530095-7fhdc" Feb 23 00:15:00 crc kubenswrapper[4735]: I0223 00:15:00.256032 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmgx8\" (UniqueName: \"kubernetes.io/projected/5e3fba12-4715-4105-bbef-e877d9c3582b-kube-api-access-fmgx8\") pod \"collect-profiles-29530095-7fhdc\" (UID: \"5e3fba12-4715-4105-bbef-e877d9c3582b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530095-7fhdc" Feb 23 00:15:00 crc kubenswrapper[4735]: I0223 00:15:00.256176 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e3fba12-4715-4105-bbef-e877d9c3582b-secret-volume\") pod \"collect-profiles-29530095-7fhdc\" (UID: \"5e3fba12-4715-4105-bbef-e877d9c3582b\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29530095-7fhdc" Feb 23 00:15:00 crc kubenswrapper[4735]: I0223 00:15:00.356891 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e3fba12-4715-4105-bbef-e877d9c3582b-config-volume\") pod \"collect-profiles-29530095-7fhdc\" (UID: \"5e3fba12-4715-4105-bbef-e877d9c3582b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530095-7fhdc" Feb 23 00:15:00 crc kubenswrapper[4735]: I0223 00:15:00.357205 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmgx8\" (UniqueName: \"kubernetes.io/projected/5e3fba12-4715-4105-bbef-e877d9c3582b-kube-api-access-fmgx8\") pod \"collect-profiles-29530095-7fhdc\" (UID: \"5e3fba12-4715-4105-bbef-e877d9c3582b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530095-7fhdc" Feb 23 00:15:00 crc kubenswrapper[4735]: I0223 00:15:00.357308 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e3fba12-4715-4105-bbef-e877d9c3582b-secret-volume\") pod \"collect-profiles-29530095-7fhdc\" (UID: \"5e3fba12-4715-4105-bbef-e877d9c3582b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530095-7fhdc" Feb 23 00:15:00 crc kubenswrapper[4735]: I0223 00:15:00.358430 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e3fba12-4715-4105-bbef-e877d9c3582b-config-volume\") pod \"collect-profiles-29530095-7fhdc\" (UID: \"5e3fba12-4715-4105-bbef-e877d9c3582b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530095-7fhdc" Feb 23 00:15:00 crc kubenswrapper[4735]: I0223 00:15:00.366014 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/5e3fba12-4715-4105-bbef-e877d9c3582b-secret-volume\") pod \"collect-profiles-29530095-7fhdc\" (UID: \"5e3fba12-4715-4105-bbef-e877d9c3582b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530095-7fhdc" Feb 23 00:15:00 crc kubenswrapper[4735]: I0223 00:15:00.374831 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmgx8\" (UniqueName: \"kubernetes.io/projected/5e3fba12-4715-4105-bbef-e877d9c3582b-kube-api-access-fmgx8\") pod \"collect-profiles-29530095-7fhdc\" (UID: \"5e3fba12-4715-4105-bbef-e877d9c3582b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530095-7fhdc" Feb 23 00:15:00 crc kubenswrapper[4735]: I0223 00:15:00.521563 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530095-7fhdc" Feb 23 00:15:00 crc kubenswrapper[4735]: I0223 00:15:00.724167 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530095-7fhdc"] Feb 23 00:15:01 crc kubenswrapper[4735]: I0223 00:15:01.398796 4735 generic.go:334] "Generic (PLEG): container finished" podID="5e3fba12-4715-4105-bbef-e877d9c3582b" containerID="c01e6a62ebcdd50282453c514fe20a69766ea19f9cbd8c27fbac0216c29f3978" exitCode=0 Feb 23 00:15:01 crc kubenswrapper[4735]: I0223 00:15:01.398913 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530095-7fhdc" event={"ID":"5e3fba12-4715-4105-bbef-e877d9c3582b","Type":"ContainerDied","Data":"c01e6a62ebcdd50282453c514fe20a69766ea19f9cbd8c27fbac0216c29f3978"} Feb 23 00:15:01 crc kubenswrapper[4735]: I0223 00:15:01.399174 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530095-7fhdc" 
event={"ID":"5e3fba12-4715-4105-bbef-e877d9c3582b","Type":"ContainerStarted","Data":"8add24a896b9e4d12a30932a25a703724cb765771198f479978dbc4f48f49f17"} Feb 23 00:15:02 crc kubenswrapper[4735]: I0223 00:15:02.729042 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530095-7fhdc" Feb 23 00:15:02 crc kubenswrapper[4735]: I0223 00:15:02.806563 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e3fba12-4715-4105-bbef-e877d9c3582b-secret-volume\") pod \"5e3fba12-4715-4105-bbef-e877d9c3582b\" (UID: \"5e3fba12-4715-4105-bbef-e877d9c3582b\") " Feb 23 00:15:02 crc kubenswrapper[4735]: I0223 00:15:02.806747 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmgx8\" (UniqueName: \"kubernetes.io/projected/5e3fba12-4715-4105-bbef-e877d9c3582b-kube-api-access-fmgx8\") pod \"5e3fba12-4715-4105-bbef-e877d9c3582b\" (UID: \"5e3fba12-4715-4105-bbef-e877d9c3582b\") " Feb 23 00:15:02 crc kubenswrapper[4735]: I0223 00:15:02.806817 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e3fba12-4715-4105-bbef-e877d9c3582b-config-volume\") pod \"5e3fba12-4715-4105-bbef-e877d9c3582b\" (UID: \"5e3fba12-4715-4105-bbef-e877d9c3582b\") " Feb 23 00:15:02 crc kubenswrapper[4735]: I0223 00:15:02.807815 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e3fba12-4715-4105-bbef-e877d9c3582b-config-volume" (OuterVolumeSpecName: "config-volume") pod "5e3fba12-4715-4105-bbef-e877d9c3582b" (UID: "5e3fba12-4715-4105-bbef-e877d9c3582b"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:15:02 crc kubenswrapper[4735]: I0223 00:15:02.813828 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e3fba12-4715-4105-bbef-e877d9c3582b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5e3fba12-4715-4105-bbef-e877d9c3582b" (UID: "5e3fba12-4715-4105-bbef-e877d9c3582b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:15:02 crc kubenswrapper[4735]: I0223 00:15:02.814088 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e3fba12-4715-4105-bbef-e877d9c3582b-kube-api-access-fmgx8" (OuterVolumeSpecName: "kube-api-access-fmgx8") pod "5e3fba12-4715-4105-bbef-e877d9c3582b" (UID: "5e3fba12-4715-4105-bbef-e877d9c3582b"). InnerVolumeSpecName "kube-api-access-fmgx8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:15:02 crc kubenswrapper[4735]: I0223 00:15:02.908993 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmgx8\" (UniqueName: \"kubernetes.io/projected/5e3fba12-4715-4105-bbef-e877d9c3582b-kube-api-access-fmgx8\") on node \"crc\" DevicePath \"\"" Feb 23 00:15:02 crc kubenswrapper[4735]: I0223 00:15:02.909041 4735 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e3fba12-4715-4105-bbef-e877d9c3582b-config-volume\") on node \"crc\" DevicePath \"\"" Feb 23 00:15:02 crc kubenswrapper[4735]: I0223 00:15:02.909061 4735 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5e3fba12-4715-4105-bbef-e877d9c3582b-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 23 00:15:03 crc kubenswrapper[4735]: I0223 00:15:03.414265 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530095-7fhdc" 
event={"ID":"5e3fba12-4715-4105-bbef-e877d9c3582b","Type":"ContainerDied","Data":"8add24a896b9e4d12a30932a25a703724cb765771198f479978dbc4f48f49f17"} Feb 23 00:15:03 crc kubenswrapper[4735]: I0223 00:15:03.414304 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8add24a896b9e4d12a30932a25a703724cb765771198f479978dbc4f48f49f17" Feb 23 00:15:03 crc kubenswrapper[4735]: I0223 00:15:03.414336 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530095-7fhdc" Feb 23 00:16:11 crc kubenswrapper[4735]: I0223 00:16:11.513456 4735 patch_prober.go:28] interesting pod/machine-config-daemon-blmnv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 00:16:11 crc kubenswrapper[4735]: I0223 00:16:11.514399 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" podUID="1cba474f-2d55-4a07-969f-25e2817a06d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 00:16:22 crc kubenswrapper[4735]: I0223 00:16:22.545449 4735 scope.go:117] "RemoveContainer" containerID="1b5e2521901e4676f48813d4d5cd59f8085da07bea1a63541f3a7349f76b9c4c" Feb 23 00:16:22 crc kubenswrapper[4735]: I0223 00:16:22.578007 4735 scope.go:117] "RemoveContainer" containerID="abc155ad6df61fdd08392e424628e1cee5325189224dbd35ce2627b3683726ae" Feb 23 00:16:41 crc kubenswrapper[4735]: I0223 00:16:41.512504 4735 patch_prober.go:28] interesting pod/machine-config-daemon-blmnv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 00:16:41 crc kubenswrapper[4735]: I0223 00:16:41.513264 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" podUID="1cba474f-2d55-4a07-969f-25e2817a06d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 00:16:47 crc kubenswrapper[4735]: I0223 00:16:47.675262 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-59rkm"] Feb 23 00:16:47 crc kubenswrapper[4735]: I0223 00:16:47.679135 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerName="ovn-controller" containerID="cri-o://9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2" gracePeriod=30 Feb 23 00:16:47 crc kubenswrapper[4735]: I0223 00:16:47.679362 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerName="northd" containerID="cri-o://2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd" gracePeriod=30 Feb 23 00:16:47 crc kubenswrapper[4735]: I0223 00:16:47.679466 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716" gracePeriod=30 Feb 23 00:16:47 crc kubenswrapper[4735]: I0223 00:16:47.679579 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerName="kube-rbac-proxy-node" 
containerID="cri-o://8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442" gracePeriod=30 Feb 23 00:16:47 crc kubenswrapper[4735]: I0223 00:16:47.679610 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerName="sbdb" containerID="cri-o://fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f" gracePeriod=30 Feb 23 00:16:47 crc kubenswrapper[4735]: I0223 00:16:47.679152 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerName="nbdb" containerID="cri-o://a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527" gracePeriod=30 Feb 23 00:16:47 crc kubenswrapper[4735]: I0223 00:16:47.679708 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerName="ovn-acl-logging" containerID="cri-o://b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7" gracePeriod=30 Feb 23 00:16:47 crc kubenswrapper[4735]: I0223 00:16:47.732089 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerName="ovnkube-controller" containerID="cri-o://d86577bcf7609c1352ab606dcac0b9e98198dfdfa5e9d9d98e8aaad6ce7f2e1e" gracePeriod=30 Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.006773 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-59rkm_66853c8a-9391-4291-b5f1-c72cb5fe23e8/ovnkube-controller/3.log" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.009130 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-59rkm_66853c8a-9391-4291-b5f1-c72cb5fe23e8/ovn-acl-logging/0.log" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.009794 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-59rkm_66853c8a-9391-4291-b5f1-c72cb5fe23e8/ovn-controller/0.log" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.010374 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.060669 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-j82tt"] Feb 23 00:16:48 crc kubenswrapper[4735]: E0223 00:16:48.061171 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerName="ovnkube-controller" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.061184 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerName="ovnkube-controller" Feb 23 00:16:48 crc kubenswrapper[4735]: E0223 00:16:48.061194 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerName="ovn-controller" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.061200 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerName="ovn-controller" Feb 23 00:16:48 crc kubenswrapper[4735]: E0223 00:16:48.061208 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerName="nbdb" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.061215 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerName="nbdb" Feb 23 00:16:48 crc kubenswrapper[4735]: E0223 00:16:48.061225 4735 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerName="ovnkube-controller" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.061230 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerName="ovnkube-controller" Feb 23 00:16:48 crc kubenswrapper[4735]: E0223 00:16:48.061239 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerName="ovn-acl-logging" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.061244 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerName="ovn-acl-logging" Feb 23 00:16:48 crc kubenswrapper[4735]: E0223 00:16:48.061253 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerName="kube-rbac-proxy-node" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.061259 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerName="kube-rbac-proxy-node" Feb 23 00:16:48 crc kubenswrapper[4735]: E0223 00:16:48.061266 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerName="kubecfg-setup" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.061271 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerName="kubecfg-setup" Feb 23 00:16:48 crc kubenswrapper[4735]: E0223 00:16:48.061279 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerName="ovnkube-controller" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.061284 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerName="ovnkube-controller" Feb 23 00:16:48 crc kubenswrapper[4735]: E0223 00:16:48.061292 4735 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerName="sbdb" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.061298 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerName="sbdb" Feb 23 00:16:48 crc kubenswrapper[4735]: E0223 00:16:48.061307 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerName="ovnkube-controller" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.061313 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerName="ovnkube-controller" Feb 23 00:16:48 crc kubenswrapper[4735]: E0223 00:16:48.061321 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerName="northd" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.061327 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerName="northd" Feb 23 00:16:48 crc kubenswrapper[4735]: E0223 00:16:48.061335 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e3fba12-4715-4105-bbef-e877d9c3582b" containerName="collect-profiles" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.061341 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e3fba12-4715-4105-bbef-e877d9c3582b" containerName="collect-profiles" Feb 23 00:16:48 crc kubenswrapper[4735]: E0223 00:16:48.061347 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerName="kube-rbac-proxy-ovn-metrics" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.061353 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerName="kube-rbac-proxy-ovn-metrics" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.061435 4735 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerName="ovnkube-controller" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.061443 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerName="ovn-acl-logging" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.061450 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerName="ovnkube-controller" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.061458 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e3fba12-4715-4105-bbef-e877d9c3582b" containerName="collect-profiles" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.061465 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerName="kube-rbac-proxy-ovn-metrics" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.061471 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerName="northd" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.061479 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerName="kube-rbac-proxy-node" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.061487 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerName="sbdb" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.061493 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerName="ovnkube-controller" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.061500 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerName="nbdb" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.061507 4735 
memory_manager.go:354] "RemoveStaleState removing state" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerName="ovn-controller" Feb 23 00:16:48 crc kubenswrapper[4735]: E0223 00:16:48.061594 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerName="ovnkube-controller" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.061602 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerName="ovnkube-controller" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.061707 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerName="ovnkube-controller" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.061716 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerName="ovnkube-controller" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.063206 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j82tt" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.138735 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-host-run-netns\") pod \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.138782 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-host-cni-netd\") pod \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.138811 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-run-openvswitch\") pod \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.138876 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/66853c8a-9391-4291-b5f1-c72cb5fe23e8-ovnkube-config\") pod \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.138910 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-run-ovn\") pod \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.138942 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.138958 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "66853c8a-9391-4291-b5f1-c72cb5fe23e8" (UID: "66853c8a-9391-4291-b5f1-c72cb5fe23e8"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.138972 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-host-slash\") pod \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.139006 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-host-slash" (OuterVolumeSpecName: "host-slash") pod "66853c8a-9391-4291-b5f1-c72cb5fe23e8" (UID: "66853c8a-9391-4291-b5f1-c72cb5fe23e8"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.139043 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-host-run-ovn-kubernetes\") pod \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.139080 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-log-socket\") pod \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.139130 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/66853c8a-9391-4291-b5f1-c72cb5fe23e8-ovnkube-script-lib\") pod \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.139167 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-etc-openvswitch\") pod \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.139200 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/66853c8a-9391-4291-b5f1-c72cb5fe23e8-env-overrides\") pod \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.139232 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-var-lib-openvswitch\") pod \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.139282 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzvft\" (UniqueName: \"kubernetes.io/projected/66853c8a-9391-4291-b5f1-c72cb5fe23e8-kube-api-access-tzvft\") pod \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.139308 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-run-systemd\") pod \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.139343 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-host-kubelet\") pod \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.139376 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-systemd-units\") pod \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.139404 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-node-log\") pod \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " Feb 23 00:16:48 crc 
kubenswrapper[4735]: I0223 00:16:48.139436 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-host-cni-bin\") pod \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.139480 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/66853c8a-9391-4291-b5f1-c72cb5fe23e8-ovn-node-metrics-cert\") pod \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\" (UID: \"66853c8a-9391-4291-b5f1-c72cb5fe23e8\") " Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.139750 4735 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.139776 4735 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-host-slash\") on node \"crc\" DevicePath \"\"" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.139047 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "66853c8a-9391-4291-b5f1-c72cb5fe23e8" (UID: "66853c8a-9391-4291-b5f1-c72cb5fe23e8"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.139626 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66853c8a-9391-4291-b5f1-c72cb5fe23e8-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "66853c8a-9391-4291-b5f1-c72cb5fe23e8" (UID: "66853c8a-9391-4291-b5f1-c72cb5fe23e8"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.139659 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "66853c8a-9391-4291-b5f1-c72cb5fe23e8" (UID: "66853c8a-9391-4291-b5f1-c72cb5fe23e8"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.139686 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "66853c8a-9391-4291-b5f1-c72cb5fe23e8" (UID: "66853c8a-9391-4291-b5f1-c72cb5fe23e8"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.140942 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66853c8a-9391-4291-b5f1-c72cb5fe23e8-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "66853c8a-9391-4291-b5f1-c72cb5fe23e8" (UID: "66853c8a-9391-4291-b5f1-c72cb5fe23e8"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.140955 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "66853c8a-9391-4291-b5f1-c72cb5fe23e8" (UID: "66853c8a-9391-4291-b5f1-c72cb5fe23e8"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.140993 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-log-socket" (OuterVolumeSpecName: "log-socket") pod "66853c8a-9391-4291-b5f1-c72cb5fe23e8" (UID: "66853c8a-9391-4291-b5f1-c72cb5fe23e8"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.140993 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "66853c8a-9391-4291-b5f1-c72cb5fe23e8" (UID: "66853c8a-9391-4291-b5f1-c72cb5fe23e8"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.141081 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "66853c8a-9391-4291-b5f1-c72cb5fe23e8" (UID: "66853c8a-9391-4291-b5f1-c72cb5fe23e8"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.141123 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "66853c8a-9391-4291-b5f1-c72cb5fe23e8" (UID: "66853c8a-9391-4291-b5f1-c72cb5fe23e8"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.141156 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "66853c8a-9391-4291-b5f1-c72cb5fe23e8" (UID: "66853c8a-9391-4291-b5f1-c72cb5fe23e8"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.141494 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66853c8a-9391-4291-b5f1-c72cb5fe23e8-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "66853c8a-9391-4291-b5f1-c72cb5fe23e8" (UID: "66853c8a-9391-4291-b5f1-c72cb5fe23e8"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.141998 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-node-log" (OuterVolumeSpecName: "node-log") pod "66853c8a-9391-4291-b5f1-c72cb5fe23e8" (UID: "66853c8a-9391-4291-b5f1-c72cb5fe23e8"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.142052 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "66853c8a-9391-4291-b5f1-c72cb5fe23e8" (UID: "66853c8a-9391-4291-b5f1-c72cb5fe23e8"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.142132 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "66853c8a-9391-4291-b5f1-c72cb5fe23e8" (UID: "66853c8a-9391-4291-b5f1-c72cb5fe23e8"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.144953 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66853c8a-9391-4291-b5f1-c72cb5fe23e8-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "66853c8a-9391-4291-b5f1-c72cb5fe23e8" (UID: "66853c8a-9391-4291-b5f1-c72cb5fe23e8"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.146355 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66853c8a-9391-4291-b5f1-c72cb5fe23e8-kube-api-access-tzvft" (OuterVolumeSpecName: "kube-api-access-tzvft") pod "66853c8a-9391-4291-b5f1-c72cb5fe23e8" (UID: "66853c8a-9391-4291-b5f1-c72cb5fe23e8"). InnerVolumeSpecName "kube-api-access-tzvft". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.146395 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4gvxr_5b63c18f-b6b2-4d97-b542-7800b475bd4c/kube-multus/2.log" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.147042 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4gvxr_5b63c18f-b6b2-4d97-b542-7800b475bd4c/kube-multus/1.log" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.147108 4735 generic.go:334] "Generic (PLEG): container finished" podID="5b63c18f-b6b2-4d97-b542-7800b475bd4c" containerID="89011481e42515946009b35bf0cf23e12f73377615e05b69a8471c4968e6bc01" exitCode=2 Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.147198 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4gvxr" event={"ID":"5b63c18f-b6b2-4d97-b542-7800b475bd4c","Type":"ContainerDied","Data":"89011481e42515946009b35bf0cf23e12f73377615e05b69a8471c4968e6bc01"} Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.147256 4735 scope.go:117] "RemoveContainer" containerID="5cd4fa9902eb9e1566216eb36f85b29b30e6a4a3f687b029c77356de5620c5e0" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.147781 4735 scope.go:117] "RemoveContainer" containerID="89011481e42515946009b35bf0cf23e12f73377615e05b69a8471c4968e6bc01" Feb 23 00:16:48 crc kubenswrapper[4735]: E0223 00:16:48.148125 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-4gvxr_openshift-multus(5b63c18f-b6b2-4d97-b542-7800b475bd4c)\"" pod="openshift-multus/multus-4gvxr" podUID="5b63c18f-b6b2-4d97-b542-7800b475bd4c" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.153061 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-59rkm_66853c8a-9391-4291-b5f1-c72cb5fe23e8/ovnkube-controller/3.log" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.158371 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-59rkm_66853c8a-9391-4291-b5f1-c72cb5fe23e8/ovn-acl-logging/0.log" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159001 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-59rkm_66853c8a-9391-4291-b5f1-c72cb5fe23e8/ovn-controller/0.log" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159359 4735 generic.go:334] "Generic (PLEG): container finished" podID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerID="d86577bcf7609c1352ab606dcac0b9e98198dfdfa5e9d9d98e8aaad6ce7f2e1e" exitCode=0 Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159400 4735 generic.go:334] "Generic (PLEG): container finished" podID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerID="fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f" exitCode=0 Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159413 4735 generic.go:334] "Generic (PLEG): container finished" podID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerID="a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527" exitCode=0 Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159421 4735 generic.go:334] "Generic (PLEG): container finished" podID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerID="2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd" exitCode=0 Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159429 4735 generic.go:334] "Generic (PLEG): container finished" podID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerID="b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716" exitCode=0 Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159438 4735 generic.go:334] "Generic (PLEG): container finished" 
podID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerID="8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442" exitCode=0
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159447 4735 generic.go:334] "Generic (PLEG): container finished" podID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerID="b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7" exitCode=143
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159456 4735 generic.go:334] "Generic (PLEG): container finished" podID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" containerID="9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2" exitCode=143
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159478 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" event={"ID":"66853c8a-9391-4291-b5f1-c72cb5fe23e8","Type":"ContainerDied","Data":"d86577bcf7609c1352ab606dcac0b9e98198dfdfa5e9d9d98e8aaad6ce7f2e1e"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159508 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" event={"ID":"66853c8a-9391-4291-b5f1-c72cb5fe23e8","Type":"ContainerDied","Data":"fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159523 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" event={"ID":"66853c8a-9391-4291-b5f1-c72cb5fe23e8","Type":"ContainerDied","Data":"a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159535 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" event={"ID":"66853c8a-9391-4291-b5f1-c72cb5fe23e8","Type":"ContainerDied","Data":"2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159549 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" event={"ID":"66853c8a-9391-4291-b5f1-c72cb5fe23e8","Type":"ContainerDied","Data":"b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159561 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" event={"ID":"66853c8a-9391-4291-b5f1-c72cb5fe23e8","Type":"ContainerDied","Data":"8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159577 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d86577bcf7609c1352ab606dcac0b9e98198dfdfa5e9d9d98e8aaad6ce7f2e1e"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159589 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dca9bac923d652f6ba53fb96a23f576aaf01f489e0626f5714ab42d5b3f46d33"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159596 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159603 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159610 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159616 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159623 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159629 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159636 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159643 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159653 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" event={"ID":"66853c8a-9391-4291-b5f1-c72cb5fe23e8","Type":"ContainerDied","Data":"b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159663 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d86577bcf7609c1352ab606dcac0b9e98198dfdfa5e9d9d98e8aaad6ce7f2e1e"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159671 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dca9bac923d652f6ba53fb96a23f576aaf01f489e0626f5714ab42d5b3f46d33"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159678 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159684 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159692 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159698 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159705 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159711 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159717 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159723 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159732 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" event={"ID":"66853c8a-9391-4291-b5f1-c72cb5fe23e8","Type":"ContainerDied","Data":"9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159742 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d86577bcf7609c1352ab606dcac0b9e98198dfdfa5e9d9d98e8aaad6ce7f2e1e"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159750 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dca9bac923d652f6ba53fb96a23f576aaf01f489e0626f5714ab42d5b3f46d33"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159757 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159764 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159770 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159775 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159783 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159790 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159796 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159802 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159811 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm" event={"ID":"66853c8a-9391-4291-b5f1-c72cb5fe23e8","Type":"ContainerDied","Data":"c050d294b8e371b56c70146fe65693fe67fe1c27d70e8f710586a0ea9aff1f67"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159821 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d86577bcf7609c1352ab606dcac0b9e98198dfdfa5e9d9d98e8aaad6ce7f2e1e"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159828 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dca9bac923d652f6ba53fb96a23f576aaf01f489e0626f5714ab42d5b3f46d33"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159836 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159843 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159868 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159876 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159882 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159889 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159895 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.159902 4735 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee"}
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.160088 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-59rkm"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.167074 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "66853c8a-9391-4291-b5f1-c72cb5fe23e8" (UID: "66853c8a-9391-4291-b5f1-c72cb5fe23e8"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.172555 4735 scope.go:117] "RemoveContainer" containerID="d86577bcf7609c1352ab606dcac0b9e98198dfdfa5e9d9d98e8aaad6ce7f2e1e"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.189989 4735 scope.go:117] "RemoveContainer" containerID="dca9bac923d652f6ba53fb96a23f576aaf01f489e0626f5714ab42d5b3f46d33"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.209123 4735 scope.go:117] "RemoveContainer" containerID="fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.219687 4735 scope.go:117] "RemoveContainer" containerID="a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.237463 4735 scope.go:117] "RemoveContainer" containerID="2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.240776 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/91044e10-c1e5-479f-b863-ea6bf220fff3-env-overrides\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.240826 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/91044e10-c1e5-479f-b863-ea6bf220fff3-host-run-netns\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.240945 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91044e10-c1e5-479f-b863-ea6bf220fff3-run-openvswitch\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.241036 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91044e10-c1e5-479f-b863-ea6bf220fff3-var-lib-openvswitch\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.241101 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/91044e10-c1e5-479f-b863-ea6bf220fff3-ovnkube-config\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.241137 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91044e10-c1e5-479f-b863-ea6bf220fff3-etc-openvswitch\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.241184 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91044e10-c1e5-479f-b863-ea6bf220fff3-host-run-ovn-kubernetes\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.241257 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/91044e10-c1e5-479f-b863-ea6bf220fff3-ovnkube-script-lib\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.241295 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/91044e10-c1e5-479f-b863-ea6bf220fff3-ovn-node-metrics-cert\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.241408 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/91044e10-c1e5-479f-b863-ea6bf220fff3-run-ovn\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.241446 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/91044e10-c1e5-479f-b863-ea6bf220fff3-host-slash\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.241580 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/91044e10-c1e5-479f-b863-ea6bf220fff3-systemd-units\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.241650 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/91044e10-c1e5-479f-b863-ea6bf220fff3-log-socket\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.241754 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/91044e10-c1e5-479f-b863-ea6bf220fff3-run-systemd\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.241811 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb75m\" (UniqueName: \"kubernetes.io/projected/91044e10-c1e5-479f-b863-ea6bf220fff3-kube-api-access-gb75m\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.241889 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/91044e10-c1e5-479f-b863-ea6bf220fff3-host-cni-bin\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.242233 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/91044e10-c1e5-479f-b863-ea6bf220fff3-node-log\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.242262 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/91044e10-c1e5-479f-b863-ea6bf220fff3-host-kubelet\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.242316 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91044e10-c1e5-479f-b863-ea6bf220fff3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.242336 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/91044e10-c1e5-479f-b863-ea6bf220fff3-host-cni-netd\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.242456 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzvft\" (UniqueName: \"kubernetes.io/projected/66853c8a-9391-4291-b5f1-c72cb5fe23e8-kube-api-access-tzvft\") on node \"crc\" DevicePath \"\""
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.242470 4735 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-run-systemd\") on node \"crc\" DevicePath \"\""
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.242479 4735 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-host-kubelet\") on node \"crc\" DevicePath \"\""
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.242487 4735 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-systemd-units\") on node \"crc\" DevicePath \"\""
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.242497 4735 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-node-log\") on node \"crc\" DevicePath \"\""
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.242505 4735 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-host-cni-bin\") on node \"crc\" DevicePath \"\""
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.242514 4735 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/66853c8a-9391-4291-b5f1-c72cb5fe23e8-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.242525 4735 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-host-run-netns\") on node \"crc\" DevicePath \"\""
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.242533 4735 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-run-openvswitch\") on node \"crc\" DevicePath \"\""
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.242541 4735 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/66853c8a-9391-4291-b5f1-c72cb5fe23e8-ovnkube-config\") on node \"crc\" DevicePath \"\""
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.242551 4735 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.242561 4735 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-run-ovn\") on node \"crc\" DevicePath \"\""
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.242570 4735 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.242580 4735 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-log-socket\") on node \"crc\" DevicePath \"\""
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.242588 4735 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/66853c8a-9391-4291-b5f1-c72cb5fe23e8-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.242596 4735 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-etc-openvswitch\") on node \"crc\" DevicePath \"\""
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.242604 4735 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/66853c8a-9391-4291-b5f1-c72cb5fe23e8-env-overrides\") on node \"crc\" DevicePath \"\""
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.242612 4735 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/66853c8a-9391-4291-b5f1-c72cb5fe23e8-var-lib-openvswitch\") on node \"crc\" DevicePath \"\""
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.265829 4735 scope.go:117] "RemoveContainer" containerID="b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.288097 4735 scope.go:117] "RemoveContainer" containerID="8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.307794 4735 scope.go:117] "RemoveContainer" containerID="b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.324157 4735 scope.go:117] "RemoveContainer" containerID="9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.343659 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91044e10-c1e5-479f-b863-ea6bf220fff3-etc-openvswitch\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.343735 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91044e10-c1e5-479f-b863-ea6bf220fff3-host-run-ovn-kubernetes\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.343801 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/91044e10-c1e5-479f-b863-ea6bf220fff3-ovnkube-script-lib\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.343840 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/91044e10-c1e5-479f-b863-ea6bf220fff3-ovn-node-metrics-cert\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.343920 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/91044e10-c1e5-479f-b863-ea6bf220fff3-run-ovn\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.343957 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/91044e10-c1e5-479f-b863-ea6bf220fff3-host-slash\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.343988 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/91044e10-c1e5-479f-b863-ea6bf220fff3-systemd-units\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.344052 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/91044e10-c1e5-479f-b863-ea6bf220fff3-log-socket\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.344092 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/91044e10-c1e5-479f-b863-ea6bf220fff3-run-systemd\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.344229 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gb75m\" (UniqueName: \"kubernetes.io/projected/91044e10-c1e5-479f-b863-ea6bf220fff3-kube-api-access-gb75m\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.344288 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/91044e10-c1e5-479f-b863-ea6bf220fff3-host-cni-bin\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.344324 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/91044e10-c1e5-479f-b863-ea6bf220fff3-node-log\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.344355 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/91044e10-c1e5-479f-b863-ea6bf220fff3-host-kubelet\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.344398 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91044e10-c1e5-479f-b863-ea6bf220fff3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.344436 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/91044e10-c1e5-479f-b863-ea6bf220fff3-host-cni-netd\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.344477 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/91044e10-c1e5-479f-b863-ea6bf220fff3-env-overrides\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.344510 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/91044e10-c1e5-479f-b863-ea6bf220fff3-host-run-netns\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.344581 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91044e10-c1e5-479f-b863-ea6bf220fff3-run-openvswitch\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.344623 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91044e10-c1e5-479f-b863-ea6bf220fff3-var-lib-openvswitch\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.344653 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/91044e10-c1e5-479f-b863-ea6bf220fff3-ovnkube-config\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.345595 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91044e10-c1e5-479f-b863-ea6bf220fff3-etc-openvswitch\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.345654 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91044e10-c1e5-479f-b863-ea6bf220fff3-host-run-ovn-kubernetes\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.345966 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/91044e10-c1e5-479f-b863-ea6bf220fff3-ovnkube-config\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.346061 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/91044e10-c1e5-479f-b863-ea6bf220fff3-host-cni-bin\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.346207 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/91044e10-c1e5-479f-b863-ea6bf220fff3-ovnkube-script-lib\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.346254 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/91044e10-c1e5-479f-b863-ea6bf220fff3-node-log\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.346249 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/91044e10-c1e5-479f-b863-ea6bf220fff3-systemd-units\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.346294 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/91044e10-c1e5-479f-b863-ea6bf220fff3-run-ovn\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.346364 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/91044e10-c1e5-479f-b863-ea6bf220fff3-host-cni-netd\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.346373 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91044e10-c1e5-479f-b863-ea6bf220fff3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.346300 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/91044e10-c1e5-479f-b863-ea6bf220fff3-host-kubelet\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.346443 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91044e10-c1e5-479f-b863-ea6bf220fff3-run-openvswitch\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt"
Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.346464 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/91044e10-c1e5-479f-b863-ea6bf220fff3-host-run-netns\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-j82tt" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.346495 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/91044e10-c1e5-479f-b863-ea6bf220fff3-log-socket\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.346505 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91044e10-c1e5-479f-b863-ea6bf220fff3-var-lib-openvswitch\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.346791 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/91044e10-c1e5-479f-b863-ea6bf220fff3-run-systemd\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.346911 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/91044e10-c1e5-479f-b863-ea6bf220fff3-env-overrides\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.346429 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/91044e10-c1e5-479f-b863-ea6bf220fff3-host-slash\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.349433 4735 scope.go:117] 
"RemoveContainer" containerID="1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.353263 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/91044e10-c1e5-479f-b863-ea6bf220fff3-ovn-node-metrics-cert\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.362774 4735 scope.go:117] "RemoveContainer" containerID="d86577bcf7609c1352ab606dcac0b9e98198dfdfa5e9d9d98e8aaad6ce7f2e1e" Feb 23 00:16:48 crc kubenswrapper[4735]: E0223 00:16:48.363584 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d86577bcf7609c1352ab606dcac0b9e98198dfdfa5e9d9d98e8aaad6ce7f2e1e\": container with ID starting with d86577bcf7609c1352ab606dcac0b9e98198dfdfa5e9d9d98e8aaad6ce7f2e1e not found: ID does not exist" containerID="d86577bcf7609c1352ab606dcac0b9e98198dfdfa5e9d9d98e8aaad6ce7f2e1e" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.363646 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d86577bcf7609c1352ab606dcac0b9e98198dfdfa5e9d9d98e8aaad6ce7f2e1e"} err="failed to get container status \"d86577bcf7609c1352ab606dcac0b9e98198dfdfa5e9d9d98e8aaad6ce7f2e1e\": rpc error: code = NotFound desc = could not find container \"d86577bcf7609c1352ab606dcac0b9e98198dfdfa5e9d9d98e8aaad6ce7f2e1e\": container with ID starting with d86577bcf7609c1352ab606dcac0b9e98198dfdfa5e9d9d98e8aaad6ce7f2e1e not found: ID does not exist" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.363684 4735 scope.go:117] "RemoveContainer" containerID="dca9bac923d652f6ba53fb96a23f576aaf01f489e0626f5714ab42d5b3f46d33" Feb 23 00:16:48 crc kubenswrapper[4735]: E0223 00:16:48.364059 4735 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dca9bac923d652f6ba53fb96a23f576aaf01f489e0626f5714ab42d5b3f46d33\": container with ID starting with dca9bac923d652f6ba53fb96a23f576aaf01f489e0626f5714ab42d5b3f46d33 not found: ID does not exist" containerID="dca9bac923d652f6ba53fb96a23f576aaf01f489e0626f5714ab42d5b3f46d33" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.364102 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dca9bac923d652f6ba53fb96a23f576aaf01f489e0626f5714ab42d5b3f46d33"} err="failed to get container status \"dca9bac923d652f6ba53fb96a23f576aaf01f489e0626f5714ab42d5b3f46d33\": rpc error: code = NotFound desc = could not find container \"dca9bac923d652f6ba53fb96a23f576aaf01f489e0626f5714ab42d5b3f46d33\": container with ID starting with dca9bac923d652f6ba53fb96a23f576aaf01f489e0626f5714ab42d5b3f46d33 not found: ID does not exist" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.364129 4735 scope.go:117] "RemoveContainer" containerID="fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f" Feb 23 00:16:48 crc kubenswrapper[4735]: E0223 00:16:48.364389 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f\": container with ID starting with fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f not found: ID does not exist" containerID="fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.364423 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f"} err="failed to get container status \"fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f\": rpc error: code = NotFound desc = could not find 
container \"fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f\": container with ID starting with fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f not found: ID does not exist" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.364445 4735 scope.go:117] "RemoveContainer" containerID="a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527" Feb 23 00:16:48 crc kubenswrapper[4735]: E0223 00:16:48.364746 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527\": container with ID starting with a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527 not found: ID does not exist" containerID="a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.364792 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527"} err="failed to get container status \"a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527\": rpc error: code = NotFound desc = could not find container \"a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527\": container with ID starting with a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527 not found: ID does not exist" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.364820 4735 scope.go:117] "RemoveContainer" containerID="2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd" Feb 23 00:16:48 crc kubenswrapper[4735]: E0223 00:16:48.365105 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd\": container with ID starting with 2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd not found: ID does 
not exist" containerID="2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.365136 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd"} err="failed to get container status \"2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd\": rpc error: code = NotFound desc = could not find container \"2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd\": container with ID starting with 2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd not found: ID does not exist" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.365162 4735 scope.go:117] "RemoveContainer" containerID="b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716" Feb 23 00:16:48 crc kubenswrapper[4735]: E0223 00:16:48.365525 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716\": container with ID starting with b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716 not found: ID does not exist" containerID="b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.365568 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716"} err="failed to get container status \"b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716\": rpc error: code = NotFound desc = could not find container \"b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716\": container with ID starting with b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716 not found: ID does not exist" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.365595 4735 
scope.go:117] "RemoveContainer" containerID="8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.366032 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb75m\" (UniqueName: \"kubernetes.io/projected/91044e10-c1e5-479f-b863-ea6bf220fff3-kube-api-access-gb75m\") pod \"ovnkube-node-j82tt\" (UID: \"91044e10-c1e5-479f-b863-ea6bf220fff3\") " pod="openshift-ovn-kubernetes/ovnkube-node-j82tt" Feb 23 00:16:48 crc kubenswrapper[4735]: E0223 00:16:48.366310 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442\": container with ID starting with 8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442 not found: ID does not exist" containerID="8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.366348 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442"} err="failed to get container status \"8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442\": rpc error: code = NotFound desc = could not find container \"8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442\": container with ID starting with 8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442 not found: ID does not exist" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.366371 4735 scope.go:117] "RemoveContainer" containerID="b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7" Feb 23 00:16:48 crc kubenswrapper[4735]: E0223 00:16:48.366638 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7\": 
container with ID starting with b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7 not found: ID does not exist" containerID="b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.366671 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7"} err="failed to get container status \"b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7\": rpc error: code = NotFound desc = could not find container \"b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7\": container with ID starting with b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7 not found: ID does not exist" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.366691 4735 scope.go:117] "RemoveContainer" containerID="9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2" Feb 23 00:16:48 crc kubenswrapper[4735]: E0223 00:16:48.367027 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2\": container with ID starting with 9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2 not found: ID does not exist" containerID="9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.367088 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2"} err="failed to get container status \"9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2\": rpc error: code = NotFound desc = could not find container \"9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2\": container with ID starting with 
9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2 not found: ID does not exist" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.367109 4735 scope.go:117] "RemoveContainer" containerID="1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee" Feb 23 00:16:48 crc kubenswrapper[4735]: E0223 00:16:48.367547 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\": container with ID starting with 1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee not found: ID does not exist" containerID="1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.367579 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee"} err="failed to get container status \"1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\": rpc error: code = NotFound desc = could not find container \"1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\": container with ID starting with 1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee not found: ID does not exist" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.367597 4735 scope.go:117] "RemoveContainer" containerID="d86577bcf7609c1352ab606dcac0b9e98198dfdfa5e9d9d98e8aaad6ce7f2e1e" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.367879 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d86577bcf7609c1352ab606dcac0b9e98198dfdfa5e9d9d98e8aaad6ce7f2e1e"} err="failed to get container status \"d86577bcf7609c1352ab606dcac0b9e98198dfdfa5e9d9d98e8aaad6ce7f2e1e\": rpc error: code = NotFound desc = could not find container \"d86577bcf7609c1352ab606dcac0b9e98198dfdfa5e9d9d98e8aaad6ce7f2e1e\": container with ID 
starting with d86577bcf7609c1352ab606dcac0b9e98198dfdfa5e9d9d98e8aaad6ce7f2e1e not found: ID does not exist" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.367910 4735 scope.go:117] "RemoveContainer" containerID="dca9bac923d652f6ba53fb96a23f576aaf01f489e0626f5714ab42d5b3f46d33" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.368226 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dca9bac923d652f6ba53fb96a23f576aaf01f489e0626f5714ab42d5b3f46d33"} err="failed to get container status \"dca9bac923d652f6ba53fb96a23f576aaf01f489e0626f5714ab42d5b3f46d33\": rpc error: code = NotFound desc = could not find container \"dca9bac923d652f6ba53fb96a23f576aaf01f489e0626f5714ab42d5b3f46d33\": container with ID starting with dca9bac923d652f6ba53fb96a23f576aaf01f489e0626f5714ab42d5b3f46d33 not found: ID does not exist" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.368249 4735 scope.go:117] "RemoveContainer" containerID="fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.368550 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f"} err="failed to get container status \"fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f\": rpc error: code = NotFound desc = could not find container \"fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f\": container with ID starting with fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f not found: ID does not exist" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.368587 4735 scope.go:117] "RemoveContainer" containerID="a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.368935 4735 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527"} err="failed to get container status \"a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527\": rpc error: code = NotFound desc = could not find container \"a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527\": container with ID starting with a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527 not found: ID does not exist" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.368959 4735 scope.go:117] "RemoveContainer" containerID="2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.369254 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd"} err="failed to get container status \"2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd\": rpc error: code = NotFound desc = could not find container \"2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd\": container with ID starting with 2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd not found: ID does not exist" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.369295 4735 scope.go:117] "RemoveContainer" containerID="b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.369576 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716"} err="failed to get container status \"b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716\": rpc error: code = NotFound desc = could not find container \"b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716\": container with ID starting with b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716 not found: ID does not 
exist" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.369606 4735 scope.go:117] "RemoveContainer" containerID="8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.369892 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442"} err="failed to get container status \"8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442\": rpc error: code = NotFound desc = could not find container \"8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442\": container with ID starting with 8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442 not found: ID does not exist" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.369919 4735 scope.go:117] "RemoveContainer" containerID="b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.370173 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7"} err="failed to get container status \"b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7\": rpc error: code = NotFound desc = could not find container \"b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7\": container with ID starting with b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7 not found: ID does not exist" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.370250 4735 scope.go:117] "RemoveContainer" containerID="9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.370699 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2"} err="failed to get container status 
\"9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2\": rpc error: code = NotFound desc = could not find container \"9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2\": container with ID starting with 9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2 not found: ID does not exist" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.370725 4735 scope.go:117] "RemoveContainer" containerID="1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.371105 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee"} err="failed to get container status \"1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\": rpc error: code = NotFound desc = could not find container \"1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\": container with ID starting with 1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee not found: ID does not exist" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.371133 4735 scope.go:117] "RemoveContainer" containerID="d86577bcf7609c1352ab606dcac0b9e98198dfdfa5e9d9d98e8aaad6ce7f2e1e" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.371422 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d86577bcf7609c1352ab606dcac0b9e98198dfdfa5e9d9d98e8aaad6ce7f2e1e"} err="failed to get container status \"d86577bcf7609c1352ab606dcac0b9e98198dfdfa5e9d9d98e8aaad6ce7f2e1e\": rpc error: code = NotFound desc = could not find container \"d86577bcf7609c1352ab606dcac0b9e98198dfdfa5e9d9d98e8aaad6ce7f2e1e\": container with ID starting with d86577bcf7609c1352ab606dcac0b9e98198dfdfa5e9d9d98e8aaad6ce7f2e1e not found: ID does not exist" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.371458 4735 scope.go:117] "RemoveContainer" 
containerID="dca9bac923d652f6ba53fb96a23f576aaf01f489e0626f5714ab42d5b3f46d33" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.371762 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dca9bac923d652f6ba53fb96a23f576aaf01f489e0626f5714ab42d5b3f46d33"} err="failed to get container status \"dca9bac923d652f6ba53fb96a23f576aaf01f489e0626f5714ab42d5b3f46d33\": rpc error: code = NotFound desc = could not find container \"dca9bac923d652f6ba53fb96a23f576aaf01f489e0626f5714ab42d5b3f46d33\": container with ID starting with dca9bac923d652f6ba53fb96a23f576aaf01f489e0626f5714ab42d5b3f46d33 not found: ID does not exist" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.371788 4735 scope.go:117] "RemoveContainer" containerID="fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.372053 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f"} err="failed to get container status \"fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f\": rpc error: code = NotFound desc = could not find container \"fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f\": container with ID starting with fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f not found: ID does not exist" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.372079 4735 scope.go:117] "RemoveContainer" containerID="a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.372321 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527"} err="failed to get container status \"a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527\": rpc error: code = NotFound desc = could 
not find container \"a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527\": container with ID starting with a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527 not found: ID does not exist" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.372347 4735 scope.go:117] "RemoveContainer" containerID="2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.372609 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd"} err="failed to get container status \"2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd\": rpc error: code = NotFound desc = could not find container \"2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd\": container with ID starting with 2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd not found: ID does not exist" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.372634 4735 scope.go:117] "RemoveContainer" containerID="b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.373066 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716"} err="failed to get container status \"b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716\": rpc error: code = NotFound desc = could not find container \"b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716\": container with ID starting with b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716 not found: ID does not exist" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.373108 4735 scope.go:117] "RemoveContainer" containerID="8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 
00:16:48.373379 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442"} err="failed to get container status \"8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442\": rpc error: code = NotFound desc = could not find container \"8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442\": container with ID starting with 8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442 not found: ID does not exist" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.373404 4735 scope.go:117] "RemoveContainer" containerID="b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.373681 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7"} err="failed to get container status \"b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7\": rpc error: code = NotFound desc = could not find container \"b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7\": container with ID starting with b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7 not found: ID does not exist" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.373712 4735 scope.go:117] "RemoveContainer" containerID="9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.374046 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2"} err="failed to get container status \"9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2\": rpc error: code = NotFound desc = could not find container \"9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2\": container with ID starting with 
9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2 not found: ID does not exist" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.374084 4735 scope.go:117] "RemoveContainer" containerID="1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.374375 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee"} err="failed to get container status \"1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\": rpc error: code = NotFound desc = could not find container \"1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\": container with ID starting with 1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee not found: ID does not exist" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.374404 4735 scope.go:117] "RemoveContainer" containerID="d86577bcf7609c1352ab606dcac0b9e98198dfdfa5e9d9d98e8aaad6ce7f2e1e" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.374669 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d86577bcf7609c1352ab606dcac0b9e98198dfdfa5e9d9d98e8aaad6ce7f2e1e"} err="failed to get container status \"d86577bcf7609c1352ab606dcac0b9e98198dfdfa5e9d9d98e8aaad6ce7f2e1e\": rpc error: code = NotFound desc = could not find container \"d86577bcf7609c1352ab606dcac0b9e98198dfdfa5e9d9d98e8aaad6ce7f2e1e\": container with ID starting with d86577bcf7609c1352ab606dcac0b9e98198dfdfa5e9d9d98e8aaad6ce7f2e1e not found: ID does not exist" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.374707 4735 scope.go:117] "RemoveContainer" containerID="dca9bac923d652f6ba53fb96a23f576aaf01f489e0626f5714ab42d5b3f46d33" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.375005 4735 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"dca9bac923d652f6ba53fb96a23f576aaf01f489e0626f5714ab42d5b3f46d33"} err="failed to get container status \"dca9bac923d652f6ba53fb96a23f576aaf01f489e0626f5714ab42d5b3f46d33\": rpc error: code = NotFound desc = could not find container \"dca9bac923d652f6ba53fb96a23f576aaf01f489e0626f5714ab42d5b3f46d33\": container with ID starting with dca9bac923d652f6ba53fb96a23f576aaf01f489e0626f5714ab42d5b3f46d33 not found: ID does not exist" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.375040 4735 scope.go:117] "RemoveContainer" containerID="fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.375303 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f"} err="failed to get container status \"fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f\": rpc error: code = NotFound desc = could not find container \"fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f\": container with ID starting with fab769dbc2020e2afe67801fcd72ac65c18241868a84a934f2c87b9b61542e1f not found: ID does not exist" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.375332 4735 scope.go:117] "RemoveContainer" containerID="a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.375603 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527"} err="failed to get container status \"a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527\": rpc error: code = NotFound desc = could not find container \"a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527\": container with ID starting with a3908dda88f4ce82573838caf7077839204bd413828eb1f3c1fd37ad19165527 not found: ID does not 
exist" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.375628 4735 scope.go:117] "RemoveContainer" containerID="2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.375921 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd"} err="failed to get container status \"2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd\": rpc error: code = NotFound desc = could not find container \"2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd\": container with ID starting with 2b684c9c8b1ca3149990fc1545bce5369f28f173b705675e41accef012f9bacd not found: ID does not exist" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.375948 4735 scope.go:117] "RemoveContainer" containerID="b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.376311 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716"} err="failed to get container status \"b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716\": rpc error: code = NotFound desc = could not find container \"b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716\": container with ID starting with b51080ead4b9d5d091505ca67ed7d3c63e5053890785fa051fc4953f4fb48716 not found: ID does not exist" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.376337 4735 scope.go:117] "RemoveContainer" containerID="8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.376669 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442"} err="failed to get container status 
\"8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442\": rpc error: code = NotFound desc = could not find container \"8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442\": container with ID starting with 8f9d23e41da2b23d5d5a50c887fe5218a3eb0d327468ad971f03954953b09442 not found: ID does not exist" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.376698 4735 scope.go:117] "RemoveContainer" containerID="b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.376986 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7"} err="failed to get container status \"b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7\": rpc error: code = NotFound desc = could not find container \"b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7\": container with ID starting with b791e4846d79877281521b0aef97589efd37eff3d9d354adff50dc19def3dfc7 not found: ID does not exist" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.377012 4735 scope.go:117] "RemoveContainer" containerID="9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.377373 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2"} err="failed to get container status \"9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2\": rpc error: code = NotFound desc = could not find container \"9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2\": container with ID starting with 9a73a6e28cd3a21e5fabf002e6024cb0b52cff6c93ee857efa06fc6163ca03f2 not found: ID does not exist" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.377399 4735 scope.go:117] "RemoveContainer" 
containerID="1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.377768 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee"} err="failed to get container status \"1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\": rpc error: code = NotFound desc = could not find container \"1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee\": container with ID starting with 1db57f90dfdb2c4b17e6be3f9c3483f2fea8a164d34666929c88ff41cfc2f3ee not found: ID does not exist" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.383227 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j82tt" Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.507521 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-59rkm"] Feb 23 00:16:48 crc kubenswrapper[4735]: I0223 00:16:48.512807 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-59rkm"] Feb 23 00:16:49 crc kubenswrapper[4735]: I0223 00:16:49.169124 4735 generic.go:334] "Generic (PLEG): container finished" podID="91044e10-c1e5-479f-b863-ea6bf220fff3" containerID="66d001b6edcfdd2d4823a48d5acd5bfa7d5fe30784408c16636acf49377e8c80" exitCode=0 Feb 23 00:16:49 crc kubenswrapper[4735]: I0223 00:16:49.169278 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j82tt" event={"ID":"91044e10-c1e5-479f-b863-ea6bf220fff3","Type":"ContainerDied","Data":"66d001b6edcfdd2d4823a48d5acd5bfa7d5fe30784408c16636acf49377e8c80"} Feb 23 00:16:49 crc kubenswrapper[4735]: I0223 00:16:49.169673 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j82tt" 
event={"ID":"91044e10-c1e5-479f-b863-ea6bf220fff3","Type":"ContainerStarted","Data":"c5633540ddd5fd47dd0216f69db823ffabefe067726e3f07635065d572f9e8ca"} Feb 23 00:16:49 crc kubenswrapper[4735]: I0223 00:16:49.176162 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4gvxr_5b63c18f-b6b2-4d97-b542-7800b475bd4c/kube-multus/2.log" Feb 23 00:16:50 crc kubenswrapper[4735]: I0223 00:16:50.189895 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j82tt" event={"ID":"91044e10-c1e5-479f-b863-ea6bf220fff3","Type":"ContainerStarted","Data":"b358a2be2dff6555814918e9f170f17a8497ad64e2af72c0af6e87699f0c55fe"} Feb 23 00:16:50 crc kubenswrapper[4735]: I0223 00:16:50.189972 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j82tt" event={"ID":"91044e10-c1e5-479f-b863-ea6bf220fff3","Type":"ContainerStarted","Data":"814bd8ec4c669c7c28c2092469ebc9f3cf994fcc3546830b1fd094371d86b2e9"} Feb 23 00:16:50 crc kubenswrapper[4735]: I0223 00:16:50.189991 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j82tt" event={"ID":"91044e10-c1e5-479f-b863-ea6bf220fff3","Type":"ContainerStarted","Data":"00ff1c2f4bb026c6f11327b4a696efc46d59e798ada2853430e6c76d4e18d147"} Feb 23 00:16:50 crc kubenswrapper[4735]: I0223 00:16:50.190008 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j82tt" event={"ID":"91044e10-c1e5-479f-b863-ea6bf220fff3","Type":"ContainerStarted","Data":"136fd0d78c91883f3d3058e4e6b72bb8e929c2120f501e50e5c9690fbbdb4b0d"} Feb 23 00:16:50 crc kubenswrapper[4735]: I0223 00:16:50.190022 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j82tt" event={"ID":"91044e10-c1e5-479f-b863-ea6bf220fff3","Type":"ContainerStarted","Data":"8044024ae01cf4b59e5fa0e95ed9dd2b3af3a9d9d3b8583daaba894ce3513365"} Feb 23 00:16:50 crc 
kubenswrapper[4735]: I0223 00:16:50.190043 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j82tt" event={"ID":"91044e10-c1e5-479f-b863-ea6bf220fff3","Type":"ContainerStarted","Data":"2d71b5ca3479432bed2c85c274add09ea4f330e87a1cf9fbe1eaf70b62a2b8d9"} Feb 23 00:16:50 crc kubenswrapper[4735]: I0223 00:16:50.282598 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66853c8a-9391-4291-b5f1-c72cb5fe23e8" path="/var/lib/kubelet/pods/66853c8a-9391-4291-b5f1-c72cb5fe23e8/volumes" Feb 23 00:16:53 crc kubenswrapper[4735]: I0223 00:16:53.216697 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j82tt" event={"ID":"91044e10-c1e5-479f-b863-ea6bf220fff3","Type":"ContainerStarted","Data":"39616a2474fdc0ae2dd2a5a97eb17a6d5b88fc5eb0a2ce7def040a968fb7aae4"} Feb 23 00:16:55 crc kubenswrapper[4735]: I0223 00:16:55.233608 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j82tt" event={"ID":"91044e10-c1e5-479f-b863-ea6bf220fff3","Type":"ContainerStarted","Data":"343a512f62c8b708a0443c5baddb848889c7fdbdf2ed1892a31ab29ab2b1759d"} Feb 23 00:16:55 crc kubenswrapper[4735]: I0223 00:16:55.233990 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j82tt" Feb 23 00:16:55 crc kubenswrapper[4735]: I0223 00:16:55.259946 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-j82tt" podStartSLOduration=7.259929075 podStartE2EDuration="7.259929075s" podCreationTimestamp="2026-02-23 00:16:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:16:55.257693205 +0000 UTC m=+573.721239176" watchObservedRunningTime="2026-02-23 00:16:55.259929075 +0000 UTC m=+573.723475046" Feb 23 00:16:55 crc kubenswrapper[4735]: I0223 
00:16:55.270030 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j82tt" Feb 23 00:16:56 crc kubenswrapper[4735]: I0223 00:16:56.241256 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j82tt" Feb 23 00:16:56 crc kubenswrapper[4735]: I0223 00:16:56.241725 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j82tt" Feb 23 00:16:56 crc kubenswrapper[4735]: I0223 00:16:56.287643 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j82tt" Feb 23 00:17:02 crc kubenswrapper[4735]: I0223 00:17:02.278050 4735 scope.go:117] "RemoveContainer" containerID="89011481e42515946009b35bf0cf23e12f73377615e05b69a8471c4968e6bc01" Feb 23 00:17:02 crc kubenswrapper[4735]: E0223 00:17:02.278974 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-4gvxr_openshift-multus(5b63c18f-b6b2-4d97-b542-7800b475bd4c)\"" pod="openshift-multus/multus-4gvxr" podUID="5b63c18f-b6b2-4d97-b542-7800b475bd4c" Feb 23 00:17:11 crc kubenswrapper[4735]: I0223 00:17:11.513133 4735 patch_prober.go:28] interesting pod/machine-config-daemon-blmnv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 00:17:11 crc kubenswrapper[4735]: I0223 00:17:11.513962 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" podUID="1cba474f-2d55-4a07-969f-25e2817a06d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" Feb 23 00:17:11 crc kubenswrapper[4735]: I0223 00:17:11.514036 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" Feb 23 00:17:11 crc kubenswrapper[4735]: I0223 00:17:11.515130 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9ffff5ef8ce3a35f166e6769e8ce86e4bf9ce64b374895f07abf527d84d7182c"} pod="openshift-machine-config-operator/machine-config-daemon-blmnv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 00:17:11 crc kubenswrapper[4735]: I0223 00:17:11.515238 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" podUID="1cba474f-2d55-4a07-969f-25e2817a06d0" containerName="machine-config-daemon" containerID="cri-o://9ffff5ef8ce3a35f166e6769e8ce86e4bf9ce64b374895f07abf527d84d7182c" gracePeriod=600 Feb 23 00:17:12 crc kubenswrapper[4735]: I0223 00:17:12.353241 4735 generic.go:334] "Generic (PLEG): container finished" podID="1cba474f-2d55-4a07-969f-25e2817a06d0" containerID="9ffff5ef8ce3a35f166e6769e8ce86e4bf9ce64b374895f07abf527d84d7182c" exitCode=0 Feb 23 00:17:12 crc kubenswrapper[4735]: I0223 00:17:12.353351 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" event={"ID":"1cba474f-2d55-4a07-969f-25e2817a06d0","Type":"ContainerDied","Data":"9ffff5ef8ce3a35f166e6769e8ce86e4bf9ce64b374895f07abf527d84d7182c"} Feb 23 00:17:12 crc kubenswrapper[4735]: I0223 00:17:12.353662 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" event={"ID":"1cba474f-2d55-4a07-969f-25e2817a06d0","Type":"ContainerStarted","Data":"6d18ff42046d8089570f691ef425e2b8cbc857ffa40454ed7ce709bd6b34ea17"} Feb 23 
00:17:12 crc kubenswrapper[4735]: I0223 00:17:12.353688 4735 scope.go:117] "RemoveContainer" containerID="32a3e61de17574fe655e2f95d40d79b68f89d06de483f3a878f524bc13ce427d" Feb 23 00:17:13 crc kubenswrapper[4735]: I0223 00:17:13.272816 4735 scope.go:117] "RemoveContainer" containerID="89011481e42515946009b35bf0cf23e12f73377615e05b69a8471c4968e6bc01" Feb 23 00:17:14 crc kubenswrapper[4735]: I0223 00:17:14.373792 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4gvxr_5b63c18f-b6b2-4d97-b542-7800b475bd4c/kube-multus/2.log" Feb 23 00:17:14 crc kubenswrapper[4735]: I0223 00:17:14.374180 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4gvxr" event={"ID":"5b63c18f-b6b2-4d97-b542-7800b475bd4c","Type":"ContainerStarted","Data":"766bf07a9047d2c63ea857951dd9611ffbbc88e5b458bc92c1d318a2726c03c8"} Feb 23 00:17:18 crc kubenswrapper[4735]: I0223 00:17:18.405800 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j82tt" Feb 23 00:17:22 crc kubenswrapper[4735]: I0223 00:17:22.638953 4735 scope.go:117] "RemoveContainer" containerID="34133bc33001e803294fef5a1e7de9bb302dd8d3c3eb47a331df115da0f95447" Feb 23 00:17:22 crc kubenswrapper[4735]: I0223 00:17:22.664426 4735 scope.go:117] "RemoveContainer" containerID="5b94d40e6164b0ed2e6ce9c8b5d41d1fd2dda247666febe6f678712cc7880357" Feb 23 00:17:55 crc kubenswrapper[4735]: I0223 00:17:55.680356 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6xh6c"] Feb 23 00:17:55 crc kubenswrapper[4735]: I0223 00:17:55.682027 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6xh6c" podUID="faa0a6f2-8a6e-4c82-b876-9cc1cff42496" containerName="registry-server" containerID="cri-o://3bd5e96205bb309bd24e97cb77f92dc3fa0c3bfed548bae7d8c23da1096f6ff7" gracePeriod=30 Feb 23 00:17:56 crc kubenswrapper[4735]: 
I0223 00:17:56.115643 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6xh6c" Feb 23 00:17:56 crc kubenswrapper[4735]: I0223 00:17:56.315590 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faa0a6f2-8a6e-4c82-b876-9cc1cff42496-utilities\") pod \"faa0a6f2-8a6e-4c82-b876-9cc1cff42496\" (UID: \"faa0a6f2-8a6e-4c82-b876-9cc1cff42496\") " Feb 23 00:17:56 crc kubenswrapper[4735]: I0223 00:17:56.315702 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2256\" (UniqueName: \"kubernetes.io/projected/faa0a6f2-8a6e-4c82-b876-9cc1cff42496-kube-api-access-c2256\") pod \"faa0a6f2-8a6e-4c82-b876-9cc1cff42496\" (UID: \"faa0a6f2-8a6e-4c82-b876-9cc1cff42496\") " Feb 23 00:17:56 crc kubenswrapper[4735]: I0223 00:17:56.316294 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faa0a6f2-8a6e-4c82-b876-9cc1cff42496-catalog-content\") pod \"faa0a6f2-8a6e-4c82-b876-9cc1cff42496\" (UID: \"faa0a6f2-8a6e-4c82-b876-9cc1cff42496\") " Feb 23 00:17:56 crc kubenswrapper[4735]: I0223 00:17:56.318398 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/faa0a6f2-8a6e-4c82-b876-9cc1cff42496-utilities" (OuterVolumeSpecName: "utilities") pod "faa0a6f2-8a6e-4c82-b876-9cc1cff42496" (UID: "faa0a6f2-8a6e-4c82-b876-9cc1cff42496"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 00:17:56 crc kubenswrapper[4735]: I0223 00:17:56.323096 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faa0a6f2-8a6e-4c82-b876-9cc1cff42496-kube-api-access-c2256" (OuterVolumeSpecName: "kube-api-access-c2256") pod "faa0a6f2-8a6e-4c82-b876-9cc1cff42496" (UID: "faa0a6f2-8a6e-4c82-b876-9cc1cff42496"). InnerVolumeSpecName "kube-api-access-c2256". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:17:56 crc kubenswrapper[4735]: I0223 00:17:56.362511 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/faa0a6f2-8a6e-4c82-b876-9cc1cff42496-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "faa0a6f2-8a6e-4c82-b876-9cc1cff42496" (UID: "faa0a6f2-8a6e-4c82-b876-9cc1cff42496"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 00:17:56 crc kubenswrapper[4735]: I0223 00:17:56.417770 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faa0a6f2-8a6e-4c82-b876-9cc1cff42496-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 00:17:56 crc kubenswrapper[4735]: I0223 00:17:56.418152 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2256\" (UniqueName: \"kubernetes.io/projected/faa0a6f2-8a6e-4c82-b876-9cc1cff42496-kube-api-access-c2256\") on node \"crc\" DevicePath \"\"" Feb 23 00:17:56 crc kubenswrapper[4735]: I0223 00:17:56.418263 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faa0a6f2-8a6e-4c82-b876-9cc1cff42496-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 00:17:56 crc kubenswrapper[4735]: I0223 00:17:56.665540 4735 generic.go:334] "Generic (PLEG): container finished" podID="faa0a6f2-8a6e-4c82-b876-9cc1cff42496" 
containerID="3bd5e96205bb309bd24e97cb77f92dc3fa0c3bfed548bae7d8c23da1096f6ff7" exitCode=0 Feb 23 00:17:56 crc kubenswrapper[4735]: I0223 00:17:56.665607 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6xh6c" event={"ID":"faa0a6f2-8a6e-4c82-b876-9cc1cff42496","Type":"ContainerDied","Data":"3bd5e96205bb309bd24e97cb77f92dc3fa0c3bfed548bae7d8c23da1096f6ff7"} Feb 23 00:17:56 crc kubenswrapper[4735]: I0223 00:17:56.665670 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6xh6c" event={"ID":"faa0a6f2-8a6e-4c82-b876-9cc1cff42496","Type":"ContainerDied","Data":"eb0eeb02ab2cb5fc6b6d6f2f07a9dd7db08a20387ac0d572e54966266f4d0c44"} Feb 23 00:17:56 crc kubenswrapper[4735]: I0223 00:17:56.665687 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6xh6c" Feb 23 00:17:56 crc kubenswrapper[4735]: I0223 00:17:56.665702 4735 scope.go:117] "RemoveContainer" containerID="3bd5e96205bb309bd24e97cb77f92dc3fa0c3bfed548bae7d8c23da1096f6ff7" Feb 23 00:17:56 crc kubenswrapper[4735]: I0223 00:17:56.696127 4735 scope.go:117] "RemoveContainer" containerID="2e3d02f2cb2f4484f35a457dbf5dd54200165dbe9807a50fcffc02719cc529fb" Feb 23 00:17:56 crc kubenswrapper[4735]: I0223 00:17:56.715339 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6xh6c"] Feb 23 00:17:56 crc kubenswrapper[4735]: I0223 00:17:56.723611 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6xh6c"] Feb 23 00:17:56 crc kubenswrapper[4735]: I0223 00:17:56.744360 4735 scope.go:117] "RemoveContainer" containerID="8c389cbb5c056bec21981f5cdad2c756e85c2f9f9880f923c97b5a9fca2aa3a2" Feb 23 00:17:56 crc kubenswrapper[4735]: I0223 00:17:56.763064 4735 scope.go:117] "RemoveContainer" containerID="3bd5e96205bb309bd24e97cb77f92dc3fa0c3bfed548bae7d8c23da1096f6ff7" Feb 23 
00:17:56 crc kubenswrapper[4735]: E0223 00:17:56.763601 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bd5e96205bb309bd24e97cb77f92dc3fa0c3bfed548bae7d8c23da1096f6ff7\": container with ID starting with 3bd5e96205bb309bd24e97cb77f92dc3fa0c3bfed548bae7d8c23da1096f6ff7 not found: ID does not exist" containerID="3bd5e96205bb309bd24e97cb77f92dc3fa0c3bfed548bae7d8c23da1096f6ff7" Feb 23 00:17:56 crc kubenswrapper[4735]: I0223 00:17:56.763654 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bd5e96205bb309bd24e97cb77f92dc3fa0c3bfed548bae7d8c23da1096f6ff7"} err="failed to get container status \"3bd5e96205bb309bd24e97cb77f92dc3fa0c3bfed548bae7d8c23da1096f6ff7\": rpc error: code = NotFound desc = could not find container \"3bd5e96205bb309bd24e97cb77f92dc3fa0c3bfed548bae7d8c23da1096f6ff7\": container with ID starting with 3bd5e96205bb309bd24e97cb77f92dc3fa0c3bfed548bae7d8c23da1096f6ff7 not found: ID does not exist" Feb 23 00:17:56 crc kubenswrapper[4735]: I0223 00:17:56.763693 4735 scope.go:117] "RemoveContainer" containerID="2e3d02f2cb2f4484f35a457dbf5dd54200165dbe9807a50fcffc02719cc529fb" Feb 23 00:17:56 crc kubenswrapper[4735]: E0223 00:17:56.764055 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e3d02f2cb2f4484f35a457dbf5dd54200165dbe9807a50fcffc02719cc529fb\": container with ID starting with 2e3d02f2cb2f4484f35a457dbf5dd54200165dbe9807a50fcffc02719cc529fb not found: ID does not exist" containerID="2e3d02f2cb2f4484f35a457dbf5dd54200165dbe9807a50fcffc02719cc529fb" Feb 23 00:17:56 crc kubenswrapper[4735]: I0223 00:17:56.764101 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e3d02f2cb2f4484f35a457dbf5dd54200165dbe9807a50fcffc02719cc529fb"} err="failed to get container status 
\"2e3d02f2cb2f4484f35a457dbf5dd54200165dbe9807a50fcffc02719cc529fb\": rpc error: code = NotFound desc = could not find container \"2e3d02f2cb2f4484f35a457dbf5dd54200165dbe9807a50fcffc02719cc529fb\": container with ID starting with 2e3d02f2cb2f4484f35a457dbf5dd54200165dbe9807a50fcffc02719cc529fb not found: ID does not exist" Feb 23 00:17:56 crc kubenswrapper[4735]: I0223 00:17:56.764125 4735 scope.go:117] "RemoveContainer" containerID="8c389cbb5c056bec21981f5cdad2c756e85c2f9f9880f923c97b5a9fca2aa3a2" Feb 23 00:17:56 crc kubenswrapper[4735]: E0223 00:17:56.764473 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c389cbb5c056bec21981f5cdad2c756e85c2f9f9880f923c97b5a9fca2aa3a2\": container with ID starting with 8c389cbb5c056bec21981f5cdad2c756e85c2f9f9880f923c97b5a9fca2aa3a2 not found: ID does not exist" containerID="8c389cbb5c056bec21981f5cdad2c756e85c2f9f9880f923c97b5a9fca2aa3a2" Feb 23 00:17:56 crc kubenswrapper[4735]: I0223 00:17:56.764514 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c389cbb5c056bec21981f5cdad2c756e85c2f9f9880f923c97b5a9fca2aa3a2"} err="failed to get container status \"8c389cbb5c056bec21981f5cdad2c756e85c2f9f9880f923c97b5a9fca2aa3a2\": rpc error: code = NotFound desc = could not find container \"8c389cbb5c056bec21981f5cdad2c756e85c2f9f9880f923c97b5a9fca2aa3a2\": container with ID starting with 8c389cbb5c056bec21981f5cdad2c756e85c2f9f9880f923c97b5a9fca2aa3a2 not found: ID does not exist" Feb 23 00:17:58 crc kubenswrapper[4735]: I0223 00:17:58.278721 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="faa0a6f2-8a6e-4c82-b876-9cc1cff42496" path="/var/lib/kubelet/pods/faa0a6f2-8a6e-4c82-b876-9cc1cff42496/volumes" Feb 23 00:17:59 crc kubenswrapper[4735]: I0223 00:17:59.209180 4735 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08dz6s8"] Feb 23 00:17:59 crc kubenswrapper[4735]: E0223 00:17:59.209394 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faa0a6f2-8a6e-4c82-b876-9cc1cff42496" containerName="registry-server" Feb 23 00:17:59 crc kubenswrapper[4735]: I0223 00:17:59.209409 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="faa0a6f2-8a6e-4c82-b876-9cc1cff42496" containerName="registry-server" Feb 23 00:17:59 crc kubenswrapper[4735]: E0223 00:17:59.209422 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faa0a6f2-8a6e-4c82-b876-9cc1cff42496" containerName="extract-content" Feb 23 00:17:59 crc kubenswrapper[4735]: I0223 00:17:59.209429 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="faa0a6f2-8a6e-4c82-b876-9cc1cff42496" containerName="extract-content" Feb 23 00:17:59 crc kubenswrapper[4735]: E0223 00:17:59.209445 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faa0a6f2-8a6e-4c82-b876-9cc1cff42496" containerName="extract-utilities" Feb 23 00:17:59 crc kubenswrapper[4735]: I0223 00:17:59.209453 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="faa0a6f2-8a6e-4c82-b876-9cc1cff42496" containerName="extract-utilities" Feb 23 00:17:59 crc kubenswrapper[4735]: I0223 00:17:59.209570 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="faa0a6f2-8a6e-4c82-b876-9cc1cff42496" containerName="registry-server" Feb 23 00:17:59 crc kubenswrapper[4735]: I0223 00:17:59.210536 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08dz6s8" Feb 23 00:17:59 crc kubenswrapper[4735]: I0223 00:17:59.213345 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 23 00:17:59 crc kubenswrapper[4735]: I0223 00:17:59.231593 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08dz6s8"] Feb 23 00:17:59 crc kubenswrapper[4735]: I0223 00:17:59.354411 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2367a963-af46-4385-b85d-75ab46713b1f-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08dz6s8\" (UID: \"2367a963-af46-4385-b85d-75ab46713b1f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08dz6s8" Feb 23 00:17:59 crc kubenswrapper[4735]: I0223 00:17:59.354686 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2367a963-af46-4385-b85d-75ab46713b1f-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08dz6s8\" (UID: \"2367a963-af46-4385-b85d-75ab46713b1f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08dz6s8" Feb 23 00:17:59 crc kubenswrapper[4735]: I0223 00:17:59.354717 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqrmj\" (UniqueName: \"kubernetes.io/projected/2367a963-af46-4385-b85d-75ab46713b1f-kube-api-access-nqrmj\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08dz6s8\" (UID: \"2367a963-af46-4385-b85d-75ab46713b1f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08dz6s8" Feb 23 00:17:59 crc kubenswrapper[4735]: 
I0223 00:17:59.455620 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2367a963-af46-4385-b85d-75ab46713b1f-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08dz6s8\" (UID: \"2367a963-af46-4385-b85d-75ab46713b1f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08dz6s8" Feb 23 00:17:59 crc kubenswrapper[4735]: I0223 00:17:59.455704 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2367a963-af46-4385-b85d-75ab46713b1f-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08dz6s8\" (UID: \"2367a963-af46-4385-b85d-75ab46713b1f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08dz6s8" Feb 23 00:17:59 crc kubenswrapper[4735]: I0223 00:17:59.455727 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqrmj\" (UniqueName: \"kubernetes.io/projected/2367a963-af46-4385-b85d-75ab46713b1f-kube-api-access-nqrmj\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08dz6s8\" (UID: \"2367a963-af46-4385-b85d-75ab46713b1f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08dz6s8" Feb 23 00:17:59 crc kubenswrapper[4735]: I0223 00:17:59.456169 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2367a963-af46-4385-b85d-75ab46713b1f-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08dz6s8\" (UID: \"2367a963-af46-4385-b85d-75ab46713b1f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08dz6s8" Feb 23 00:17:59 crc kubenswrapper[4735]: I0223 00:17:59.456373 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/2367a963-af46-4385-b85d-75ab46713b1f-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08dz6s8\" (UID: \"2367a963-af46-4385-b85d-75ab46713b1f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08dz6s8" Feb 23 00:17:59 crc kubenswrapper[4735]: I0223 00:17:59.477941 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqrmj\" (UniqueName: \"kubernetes.io/projected/2367a963-af46-4385-b85d-75ab46713b1f-kube-api-access-nqrmj\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08dz6s8\" (UID: \"2367a963-af46-4385-b85d-75ab46713b1f\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08dz6s8" Feb 23 00:17:59 crc kubenswrapper[4735]: I0223 00:17:59.541045 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08dz6s8" Feb 23 00:18:00 crc kubenswrapper[4735]: I0223 00:18:00.035492 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08dz6s8"] Feb 23 00:18:00 crc kubenswrapper[4735]: I0223 00:18:00.725619 4735 generic.go:334] "Generic (PLEG): container finished" podID="2367a963-af46-4385-b85d-75ab46713b1f" containerID="de268104e3e672b55ec837baa992ba02583dad4d9ae545d53c867f3ff6670e02" exitCode=0 Feb 23 00:18:00 crc kubenswrapper[4735]: I0223 00:18:00.725679 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08dz6s8" event={"ID":"2367a963-af46-4385-b85d-75ab46713b1f","Type":"ContainerDied","Data":"de268104e3e672b55ec837baa992ba02583dad4d9ae545d53c867f3ff6670e02"} Feb 23 00:18:00 crc kubenswrapper[4735]: I0223 00:18:00.727029 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08dz6s8" event={"ID":"2367a963-af46-4385-b85d-75ab46713b1f","Type":"ContainerStarted","Data":"1eff668e428ad10e8b3f1449b012d7819dc413ebb1f7d9f4b0d0c47d6178b3a2"} Feb 23 00:18:00 crc kubenswrapper[4735]: I0223 00:18:00.743560 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 00:18:02 crc kubenswrapper[4735]: I0223 00:18:02.744053 4735 generic.go:334] "Generic (PLEG): container finished" podID="2367a963-af46-4385-b85d-75ab46713b1f" containerID="5adeb535fee85b25ad51988732c73d6ca2852a2295ec389cd8f9ce7b9577bc2a" exitCode=0 Feb 23 00:18:02 crc kubenswrapper[4735]: I0223 00:18:02.744157 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08dz6s8" event={"ID":"2367a963-af46-4385-b85d-75ab46713b1f","Type":"ContainerDied","Data":"5adeb535fee85b25ad51988732c73d6ca2852a2295ec389cd8f9ce7b9577bc2a"} Feb 23 00:18:03 crc kubenswrapper[4735]: I0223 00:18:03.755807 4735 generic.go:334] "Generic (PLEG): container finished" podID="2367a963-af46-4385-b85d-75ab46713b1f" containerID="5c8741963e7bea2bc28b2e929789d1e6a33a267142497c5b4983d5121292a70d" exitCode=0 Feb 23 00:18:03 crc kubenswrapper[4735]: I0223 00:18:03.755958 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08dz6s8" event={"ID":"2367a963-af46-4385-b85d-75ab46713b1f","Type":"ContainerDied","Data":"5c8741963e7bea2bc28b2e929789d1e6a33a267142497c5b4983d5121292a70d"} Feb 23 00:18:05 crc kubenswrapper[4735]: I0223 00:18:05.073396 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08dz6s8" Feb 23 00:18:05 crc kubenswrapper[4735]: I0223 00:18:05.150962 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqrmj\" (UniqueName: \"kubernetes.io/projected/2367a963-af46-4385-b85d-75ab46713b1f-kube-api-access-nqrmj\") pod \"2367a963-af46-4385-b85d-75ab46713b1f\" (UID: \"2367a963-af46-4385-b85d-75ab46713b1f\") " Feb 23 00:18:05 crc kubenswrapper[4735]: I0223 00:18:05.151165 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2367a963-af46-4385-b85d-75ab46713b1f-util\") pod \"2367a963-af46-4385-b85d-75ab46713b1f\" (UID: \"2367a963-af46-4385-b85d-75ab46713b1f\") " Feb 23 00:18:05 crc kubenswrapper[4735]: I0223 00:18:05.151396 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2367a963-af46-4385-b85d-75ab46713b1f-bundle\") pod \"2367a963-af46-4385-b85d-75ab46713b1f\" (UID: \"2367a963-af46-4385-b85d-75ab46713b1f\") " Feb 23 00:18:05 crc kubenswrapper[4735]: I0223 00:18:05.155542 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2367a963-af46-4385-b85d-75ab46713b1f-bundle" (OuterVolumeSpecName: "bundle") pod "2367a963-af46-4385-b85d-75ab46713b1f" (UID: "2367a963-af46-4385-b85d-75ab46713b1f"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 00:18:05 crc kubenswrapper[4735]: I0223 00:18:05.156422 4735 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2367a963-af46-4385-b85d-75ab46713b1f-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 00:18:05 crc kubenswrapper[4735]: I0223 00:18:05.161096 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2367a963-af46-4385-b85d-75ab46713b1f-kube-api-access-nqrmj" (OuterVolumeSpecName: "kube-api-access-nqrmj") pod "2367a963-af46-4385-b85d-75ab46713b1f" (UID: "2367a963-af46-4385-b85d-75ab46713b1f"). InnerVolumeSpecName "kube-api-access-nqrmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:18:05 crc kubenswrapper[4735]: I0223 00:18:05.179176 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2367a963-af46-4385-b85d-75ab46713b1f-util" (OuterVolumeSpecName: "util") pod "2367a963-af46-4385-b85d-75ab46713b1f" (UID: "2367a963-af46-4385-b85d-75ab46713b1f"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 00:18:05 crc kubenswrapper[4735]: I0223 00:18:05.258377 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqrmj\" (UniqueName: \"kubernetes.io/projected/2367a963-af46-4385-b85d-75ab46713b1f-kube-api-access-nqrmj\") on node \"crc\" DevicePath \"\"" Feb 23 00:18:05 crc kubenswrapper[4735]: I0223 00:18:05.258473 4735 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2367a963-af46-4385-b85d-75ab46713b1f-util\") on node \"crc\" DevicePath \"\"" Feb 23 00:18:05 crc kubenswrapper[4735]: I0223 00:18:05.770928 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08dz6s8" event={"ID":"2367a963-af46-4385-b85d-75ab46713b1f","Type":"ContainerDied","Data":"1eff668e428ad10e8b3f1449b012d7819dc413ebb1f7d9f4b0d0c47d6178b3a2"} Feb 23 00:18:05 crc kubenswrapper[4735]: I0223 00:18:05.770997 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1eff668e428ad10e8b3f1449b012d7819dc413ebb1f7d9f4b0d0c47d6178b3a2" Feb 23 00:18:05 crc kubenswrapper[4735]: I0223 00:18:05.771000 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08dz6s8" Feb 23 00:18:06 crc kubenswrapper[4735]: I0223 00:18:06.219642 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1btjvv"] Feb 23 00:18:06 crc kubenswrapper[4735]: E0223 00:18:06.221762 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2367a963-af46-4385-b85d-75ab46713b1f" containerName="util" Feb 23 00:18:06 crc kubenswrapper[4735]: I0223 00:18:06.222051 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2367a963-af46-4385-b85d-75ab46713b1f" containerName="util" Feb 23 00:18:06 crc kubenswrapper[4735]: E0223 00:18:06.222260 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2367a963-af46-4385-b85d-75ab46713b1f" containerName="pull" Feb 23 00:18:06 crc kubenswrapper[4735]: I0223 00:18:06.222399 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2367a963-af46-4385-b85d-75ab46713b1f" containerName="pull" Feb 23 00:18:06 crc kubenswrapper[4735]: E0223 00:18:06.222534 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2367a963-af46-4385-b85d-75ab46713b1f" containerName="extract" Feb 23 00:18:06 crc kubenswrapper[4735]: I0223 00:18:06.222654 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="2367a963-af46-4385-b85d-75ab46713b1f" containerName="extract" Feb 23 00:18:06 crc kubenswrapper[4735]: I0223 00:18:06.223009 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="2367a963-af46-4385-b85d-75ab46713b1f" containerName="extract" Feb 23 00:18:06 crc kubenswrapper[4735]: I0223 00:18:06.224762 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1btjvv" Feb 23 00:18:06 crc kubenswrapper[4735]: I0223 00:18:06.229479 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 23 00:18:06 crc kubenswrapper[4735]: I0223 00:18:06.233741 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1btjvv"] Feb 23 00:18:06 crc kubenswrapper[4735]: I0223 00:18:06.271394 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ac06d817-3d30-4d1b-aa9c-bcff267ad35c-bundle\") pod \"00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1btjvv\" (UID: \"ac06d817-3d30-4d1b-aa9c-bcff267ad35c\") " pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1btjvv" Feb 23 00:18:06 crc kubenswrapper[4735]: I0223 00:18:06.271711 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-727w7\" (UniqueName: \"kubernetes.io/projected/ac06d817-3d30-4d1b-aa9c-bcff267ad35c-kube-api-access-727w7\") pod \"00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1btjvv\" (UID: \"ac06d817-3d30-4d1b-aa9c-bcff267ad35c\") " pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1btjvv" Feb 23 00:18:06 crc kubenswrapper[4735]: I0223 00:18:06.271836 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ac06d817-3d30-4d1b-aa9c-bcff267ad35c-util\") pod \"00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1btjvv\" (UID: \"ac06d817-3d30-4d1b-aa9c-bcff267ad35c\") " pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1btjvv" Feb 23 00:18:06 crc kubenswrapper[4735]: 
I0223 00:18:06.373150 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-727w7\" (UniqueName: \"kubernetes.io/projected/ac06d817-3d30-4d1b-aa9c-bcff267ad35c-kube-api-access-727w7\") pod \"00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1btjvv\" (UID: \"ac06d817-3d30-4d1b-aa9c-bcff267ad35c\") " pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1btjvv" Feb 23 00:18:06 crc kubenswrapper[4735]: I0223 00:18:06.373251 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ac06d817-3d30-4d1b-aa9c-bcff267ad35c-util\") pod \"00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1btjvv\" (UID: \"ac06d817-3d30-4d1b-aa9c-bcff267ad35c\") " pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1btjvv" Feb 23 00:18:06 crc kubenswrapper[4735]: I0223 00:18:06.374056 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ac06d817-3d30-4d1b-aa9c-bcff267ad35c-bundle\") pod \"00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1btjvv\" (UID: \"ac06d817-3d30-4d1b-aa9c-bcff267ad35c\") " pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1btjvv" Feb 23 00:18:06 crc kubenswrapper[4735]: I0223 00:18:06.374104 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ac06d817-3d30-4d1b-aa9c-bcff267ad35c-util\") pod \"00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1btjvv\" (UID: \"ac06d817-3d30-4d1b-aa9c-bcff267ad35c\") " pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1btjvv" Feb 23 00:18:06 crc kubenswrapper[4735]: I0223 00:18:06.374365 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/ac06d817-3d30-4d1b-aa9c-bcff267ad35c-bundle\") pod \"00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1btjvv\" (UID: \"ac06d817-3d30-4d1b-aa9c-bcff267ad35c\") " pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1btjvv" Feb 23 00:18:06 crc kubenswrapper[4735]: I0223 00:18:06.397485 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-727w7\" (UniqueName: \"kubernetes.io/projected/ac06d817-3d30-4d1b-aa9c-bcff267ad35c-kube-api-access-727w7\") pod \"00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1btjvv\" (UID: \"ac06d817-3d30-4d1b-aa9c-bcff267ad35c\") " pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1btjvv" Feb 23 00:18:06 crc kubenswrapper[4735]: I0223 00:18:06.595617 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1btjvv" Feb 23 00:18:06 crc kubenswrapper[4735]: I0223 00:18:06.857512 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1btjvv"] Feb 23 00:18:06 crc kubenswrapper[4735]: W0223 00:18:06.859084 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac06d817_3d30_4d1b_aa9c_bcff267ad35c.slice/crio-dc5b75b2415ce532ba2dc80743190cf433828076e5b9ebed67ea007b43e18859 WatchSource:0}: Error finding container dc5b75b2415ce532ba2dc80743190cf433828076e5b9ebed67ea007b43e18859: Status 404 returned error can't find the container with id dc5b75b2415ce532ba2dc80743190cf433828076e5b9ebed67ea007b43e18859 Feb 23 00:18:07 crc kubenswrapper[4735]: I0223 00:18:07.783450 4735 generic.go:334] "Generic (PLEG): container finished" podID="ac06d817-3d30-4d1b-aa9c-bcff267ad35c" containerID="2c30147da85458eab5e0e83d806f79cc7845cf0d409c4b3458f90db05792c38a" 
exitCode=0 Feb 23 00:18:07 crc kubenswrapper[4735]: I0223 00:18:07.783822 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1btjvv" event={"ID":"ac06d817-3d30-4d1b-aa9c-bcff267ad35c","Type":"ContainerDied","Data":"2c30147da85458eab5e0e83d806f79cc7845cf0d409c4b3458f90db05792c38a"} Feb 23 00:18:07 crc kubenswrapper[4735]: I0223 00:18:07.783890 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1btjvv" event={"ID":"ac06d817-3d30-4d1b-aa9c-bcff267ad35c","Type":"ContainerStarted","Data":"dc5b75b2415ce532ba2dc80743190cf433828076e5b9ebed67ea007b43e18859"} Feb 23 00:18:08 crc kubenswrapper[4735]: I0223 00:18:08.793233 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1btjvv" event={"ID":"ac06d817-3d30-4d1b-aa9c-bcff267ad35c","Type":"ContainerStarted","Data":"87220c101e7c171bb19d866260be9c2d734c96dd9cf885bfb969faa957a7158e"} Feb 23 00:18:09 crc kubenswrapper[4735]: I0223 00:18:09.801269 4735 generic.go:334] "Generic (PLEG): container finished" podID="ac06d817-3d30-4d1b-aa9c-bcff267ad35c" containerID="87220c101e7c171bb19d866260be9c2d734c96dd9cf885bfb969faa957a7158e" exitCode=0 Feb 23 00:18:09 crc kubenswrapper[4735]: I0223 00:18:09.801323 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1btjvv" event={"ID":"ac06d817-3d30-4d1b-aa9c-bcff267ad35c","Type":"ContainerDied","Data":"87220c101e7c171bb19d866260be9c2d734c96dd9cf885bfb969faa957a7158e"} Feb 23 00:18:10 crc kubenswrapper[4735]: I0223 00:18:10.425472 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jm8dz"] Feb 23 00:18:10 crc kubenswrapper[4735]: I0223 00:18:10.426409 4735 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jm8dz" Feb 23 00:18:10 crc kubenswrapper[4735]: I0223 00:18:10.454307 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jm8dz"] Feb 23 00:18:10 crc kubenswrapper[4735]: I0223 00:18:10.529695 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jm8dz\" (UID: \"84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jm8dz" Feb 23 00:18:10 crc kubenswrapper[4735]: I0223 00:18:10.529779 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jm8dz\" (UID: \"84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jm8dz" Feb 23 00:18:10 crc kubenswrapper[4735]: I0223 00:18:10.529908 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgw84\" (UniqueName: \"kubernetes.io/projected/84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684-kube-api-access-zgw84\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jm8dz\" (UID: \"84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jm8dz" Feb 23 00:18:10 crc kubenswrapper[4735]: I0223 00:18:10.630916 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jm8dz\" (UID: \"84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jm8dz" Feb 23 00:18:10 crc kubenswrapper[4735]: I0223 00:18:10.630978 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgw84\" (UniqueName: \"kubernetes.io/projected/84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684-kube-api-access-zgw84\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jm8dz\" (UID: \"84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jm8dz" Feb 23 00:18:10 crc kubenswrapper[4735]: I0223 00:18:10.631021 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jm8dz\" (UID: \"84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jm8dz" Feb 23 00:18:10 crc kubenswrapper[4735]: I0223 00:18:10.631532 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jm8dz\" (UID: \"84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jm8dz" Feb 23 00:18:10 crc kubenswrapper[4735]: I0223 00:18:10.631554 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jm8dz\" (UID: 
\"84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jm8dz" Feb 23 00:18:10 crc kubenswrapper[4735]: I0223 00:18:10.662047 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgw84\" (UniqueName: \"kubernetes.io/projected/84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684-kube-api-access-zgw84\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jm8dz\" (UID: \"84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jm8dz" Feb 23 00:18:10 crc kubenswrapper[4735]: I0223 00:18:10.738723 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jm8dz" Feb 23 00:18:10 crc kubenswrapper[4735]: I0223 00:18:10.817199 4735 generic.go:334] "Generic (PLEG): container finished" podID="ac06d817-3d30-4d1b-aa9c-bcff267ad35c" containerID="f5abac762ec347001ca17f63c143ab846be7ec8bdfc75e1117da25f4360173d0" exitCode=0 Feb 23 00:18:10 crc kubenswrapper[4735]: I0223 00:18:10.817260 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1btjvv" event={"ID":"ac06d817-3d30-4d1b-aa9c-bcff267ad35c","Type":"ContainerDied","Data":"f5abac762ec347001ca17f63c143ab846be7ec8bdfc75e1117da25f4360173d0"} Feb 23 00:18:11 crc kubenswrapper[4735]: I0223 00:18:11.363615 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jm8dz"] Feb 23 00:18:11 crc kubenswrapper[4735]: I0223 00:18:11.822588 4735 generic.go:334] "Generic (PLEG): container finished" podID="84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684" containerID="d4207661549968a62241a684772412434cf8de6c5f562ec77062f7ef4858dda5" exitCode=0 Feb 23 00:18:11 crc kubenswrapper[4735]: I0223 00:18:11.822757 4735 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jm8dz" event={"ID":"84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684","Type":"ContainerDied","Data":"d4207661549968a62241a684772412434cf8de6c5f562ec77062f7ef4858dda5"} Feb 23 00:18:11 crc kubenswrapper[4735]: I0223 00:18:11.823657 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jm8dz" event={"ID":"84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684","Type":"ContainerStarted","Data":"7b16eb7662d9d57abb156c9ae4468f650040a577a689e8d9b9cf4787e3bf367f"} Feb 23 00:18:12 crc kubenswrapper[4735]: I0223 00:18:12.098843 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1btjvv" Feb 23 00:18:12 crc kubenswrapper[4735]: I0223 00:18:12.277747 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ac06d817-3d30-4d1b-aa9c-bcff267ad35c-util\") pod \"ac06d817-3d30-4d1b-aa9c-bcff267ad35c\" (UID: \"ac06d817-3d30-4d1b-aa9c-bcff267ad35c\") " Feb 23 00:18:12 crc kubenswrapper[4735]: I0223 00:18:12.277816 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-727w7\" (UniqueName: \"kubernetes.io/projected/ac06d817-3d30-4d1b-aa9c-bcff267ad35c-kube-api-access-727w7\") pod \"ac06d817-3d30-4d1b-aa9c-bcff267ad35c\" (UID: \"ac06d817-3d30-4d1b-aa9c-bcff267ad35c\") " Feb 23 00:18:12 crc kubenswrapper[4735]: I0223 00:18:12.277867 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ac06d817-3d30-4d1b-aa9c-bcff267ad35c-bundle\") pod \"ac06d817-3d30-4d1b-aa9c-bcff267ad35c\" (UID: \"ac06d817-3d30-4d1b-aa9c-bcff267ad35c\") " Feb 23 00:18:12 crc kubenswrapper[4735]: I0223 00:18:12.284300 4735 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac06d817-3d30-4d1b-aa9c-bcff267ad35c-kube-api-access-727w7" (OuterVolumeSpecName: "kube-api-access-727w7") pod "ac06d817-3d30-4d1b-aa9c-bcff267ad35c" (UID: "ac06d817-3d30-4d1b-aa9c-bcff267ad35c"). InnerVolumeSpecName "kube-api-access-727w7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:18:12 crc kubenswrapper[4735]: I0223 00:18:12.290171 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac06d817-3d30-4d1b-aa9c-bcff267ad35c-bundle" (OuterVolumeSpecName: "bundle") pod "ac06d817-3d30-4d1b-aa9c-bcff267ad35c" (UID: "ac06d817-3d30-4d1b-aa9c-bcff267ad35c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 00:18:12 crc kubenswrapper[4735]: I0223 00:18:12.379382 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-727w7\" (UniqueName: \"kubernetes.io/projected/ac06d817-3d30-4d1b-aa9c-bcff267ad35c-kube-api-access-727w7\") on node \"crc\" DevicePath \"\"" Feb 23 00:18:12 crc kubenswrapper[4735]: I0223 00:18:12.379413 4735 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ac06d817-3d30-4d1b-aa9c-bcff267ad35c-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 00:18:12 crc kubenswrapper[4735]: I0223 00:18:12.446349 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac06d817-3d30-4d1b-aa9c-bcff267ad35c-util" (OuterVolumeSpecName: "util") pod "ac06d817-3d30-4d1b-aa9c-bcff267ad35c" (UID: "ac06d817-3d30-4d1b-aa9c-bcff267ad35c"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 00:18:12 crc kubenswrapper[4735]: I0223 00:18:12.480957 4735 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ac06d817-3d30-4d1b-aa9c-bcff267ad35c-util\") on node \"crc\" DevicePath \"\""
Feb 23 00:18:12 crc kubenswrapper[4735]: I0223 00:18:12.830456 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1btjvv" event={"ID":"ac06d817-3d30-4d1b-aa9c-bcff267ad35c","Type":"ContainerDied","Data":"dc5b75b2415ce532ba2dc80743190cf433828076e5b9ebed67ea007b43e18859"}
Feb 23 00:18:12 crc kubenswrapper[4735]: I0223 00:18:12.830504 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc5b75b2415ce532ba2dc80743190cf433828076e5b9ebed67ea007b43e18859"
Feb 23 00:18:12 crc kubenswrapper[4735]: I0223 00:18:12.830579 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1btjvv"
Feb 23 00:18:15 crc kubenswrapper[4735]: I0223 00:18:15.845734 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jm8dz" event={"ID":"84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684","Type":"ContainerStarted","Data":"681bf441078c2fd0090aaed3607e521f3e33c125a50344fe74dc5b32c6167cd2"}
Feb 23 00:18:16 crc kubenswrapper[4735]: I0223 00:18:16.852285 4735 generic.go:334] "Generic (PLEG): container finished" podID="84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684" containerID="681bf441078c2fd0090aaed3607e521f3e33c125a50344fe74dc5b32c6167cd2" exitCode=0
Feb 23 00:18:16 crc kubenswrapper[4735]: I0223 00:18:16.852347 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jm8dz" event={"ID":"84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684","Type":"ContainerDied","Data":"681bf441078c2fd0090aaed3607e521f3e33c125a50344fe74dc5b32c6167cd2"}
Feb 23 00:18:17 crc kubenswrapper[4735]: I0223 00:18:17.864064 4735 generic.go:334] "Generic (PLEG): container finished" podID="84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684" containerID="4bea1075a419929db9523692ee32bf5fa9e8f56effd7a0101fc5300581a58089" exitCode=0
Feb 23 00:18:17 crc kubenswrapper[4735]: I0223 00:18:17.864158 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jm8dz" event={"ID":"84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684","Type":"ContainerDied","Data":"4bea1075a419929db9523692ee32bf5fa9e8f56effd7a0101fc5300581a58089"}
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.513744 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-t22rv"]
Feb 23 00:18:18 crc kubenswrapper[4735]: E0223 00:18:18.514282 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac06d817-3d30-4d1b-aa9c-bcff267ad35c" containerName="pull"
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.514294 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac06d817-3d30-4d1b-aa9c-bcff267ad35c" containerName="pull"
Feb 23 00:18:18 crc kubenswrapper[4735]: E0223 00:18:18.514308 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac06d817-3d30-4d1b-aa9c-bcff267ad35c" containerName="util"
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.514317 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac06d817-3d30-4d1b-aa9c-bcff267ad35c" containerName="util"
Feb 23 00:18:18 crc kubenswrapper[4735]: E0223 00:18:18.514327 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac06d817-3d30-4d1b-aa9c-bcff267ad35c" containerName="extract"
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.514334 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac06d817-3d30-4d1b-aa9c-bcff267ad35c" containerName="extract"
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.514451 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac06d817-3d30-4d1b-aa9c-bcff267ad35c" containerName="extract"
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.514900 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-t22rv"
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.516742 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt"
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.518075 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt"
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.518833 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-jhpbs"
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.569160 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-t22rv"]
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.622843 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcgfj\" (UniqueName: \"kubernetes.io/projected/1e169620-1b41-4184-9268-ff74d8f3e1a5-kube-api-access-zcgfj\") pod \"obo-prometheus-operator-68bc856cb9-t22rv\" (UID: \"1e169620-1b41-4184-9268-ff74d8f3e1a5\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-t22rv"
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.633077 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7d8ddb755b-qzw7j"]
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.633704 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d8ddb755b-qzw7j"
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.635479 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-fmsb5"
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.637450 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert"
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.646538 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7d8ddb755b-zz6rp"]
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.647231 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d8ddb755b-zz6rp"
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.662514 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7d8ddb755b-zz6rp"]
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.696731 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7d8ddb755b-qzw7j"]
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.723427 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcgfj\" (UniqueName: \"kubernetes.io/projected/1e169620-1b41-4184-9268-ff74d8f3e1a5-kube-api-access-zcgfj\") pod \"obo-prometheus-operator-68bc856cb9-t22rv\" (UID: \"1e169620-1b41-4184-9268-ff74d8f3e1a5\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-t22rv"
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.723473 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7a5bb4ed-d256-4bcf-aae6-1959a199d920-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7d8ddb755b-zz6rp\" (UID: \"7a5bb4ed-d256-4bcf-aae6-1959a199d920\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d8ddb755b-zz6rp"
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.723502 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7a5bb4ed-d256-4bcf-aae6-1959a199d920-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7d8ddb755b-zz6rp\" (UID: \"7a5bb4ed-d256-4bcf-aae6-1959a199d920\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d8ddb755b-zz6rp"
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.723547 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8dc28938-196d-418f-8b9e-9e41eca4ee56-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7d8ddb755b-qzw7j\" (UID: \"8dc28938-196d-418f-8b9e-9e41eca4ee56\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d8ddb755b-qzw7j"
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.723570 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8dc28938-196d-418f-8b9e-9e41eca4ee56-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7d8ddb755b-qzw7j\" (UID: \"8dc28938-196d-418f-8b9e-9e41eca4ee56\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d8ddb755b-qzw7j"
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.749044 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-8s6qj"]
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.749663 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-8s6qj"
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.750262 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcgfj\" (UniqueName: \"kubernetes.io/projected/1e169620-1b41-4184-9268-ff74d8f3e1a5-kube-api-access-zcgfj\") pod \"obo-prometheus-operator-68bc856cb9-t22rv\" (UID: \"1e169620-1b41-4184-9268-ff74d8f3e1a5\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-t22rv"
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.751531 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls"
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.751785 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-d4ftf"
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.771132 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-8s6qj"]
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.824266 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8dc28938-196d-418f-8b9e-9e41eca4ee56-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7d8ddb755b-qzw7j\" (UID: \"8dc28938-196d-418f-8b9e-9e41eca4ee56\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d8ddb755b-qzw7j"
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.824325 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8dc28938-196d-418f-8b9e-9e41eca4ee56-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7d8ddb755b-qzw7j\" (UID: \"8dc28938-196d-418f-8b9e-9e41eca4ee56\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d8ddb755b-qzw7j"
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.824373 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6gwp\" (UniqueName: \"kubernetes.io/projected/bf421747-d273-4d48-bc0c-dd2947ac646a-kube-api-access-j6gwp\") pod \"observability-operator-59bdc8b94-8s6qj\" (UID: \"bf421747-d273-4d48-bc0c-dd2947ac646a\") " pod="openshift-operators/observability-operator-59bdc8b94-8s6qj"
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.824424 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7a5bb4ed-d256-4bcf-aae6-1959a199d920-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7d8ddb755b-zz6rp\" (UID: \"7a5bb4ed-d256-4bcf-aae6-1959a199d920\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d8ddb755b-zz6rp"
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.824458 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7a5bb4ed-d256-4bcf-aae6-1959a199d920-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7d8ddb755b-zz6rp\" (UID: \"7a5bb4ed-d256-4bcf-aae6-1959a199d920\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d8ddb755b-zz6rp"
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.824476 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf421747-d273-4d48-bc0c-dd2947ac646a-observability-operator-tls\") pod \"observability-operator-59bdc8b94-8s6qj\" (UID: \"bf421747-d273-4d48-bc0c-dd2947ac646a\") " pod="openshift-operators/observability-operator-59bdc8b94-8s6qj"
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.831035 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-t22rv"
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.839461 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7a5bb4ed-d256-4bcf-aae6-1959a199d920-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7d8ddb755b-zz6rp\" (UID: \"7a5bb4ed-d256-4bcf-aae6-1959a199d920\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d8ddb755b-zz6rp"
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.839975 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7a5bb4ed-d256-4bcf-aae6-1959a199d920-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7d8ddb755b-zz6rp\" (UID: \"7a5bb4ed-d256-4bcf-aae6-1959a199d920\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d8ddb755b-zz6rp"
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.842228 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8dc28938-196d-418f-8b9e-9e41eca4ee56-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7d8ddb755b-qzw7j\" (UID: \"8dc28938-196d-418f-8b9e-9e41eca4ee56\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d8ddb755b-qzw7j"
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.842809 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8dc28938-196d-418f-8b9e-9e41eca4ee56-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7d8ddb755b-qzw7j\" (UID: \"8dc28938-196d-418f-8b9e-9e41eca4ee56\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d8ddb755b-qzw7j"
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.925631 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6gwp\" (UniqueName: \"kubernetes.io/projected/bf421747-d273-4d48-bc0c-dd2947ac646a-kube-api-access-j6gwp\") pod \"observability-operator-59bdc8b94-8s6qj\" (UID: \"bf421747-d273-4d48-bc0c-dd2947ac646a\") " pod="openshift-operators/observability-operator-59bdc8b94-8s6qj"
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.925726 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf421747-d273-4d48-bc0c-dd2947ac646a-observability-operator-tls\") pod \"observability-operator-59bdc8b94-8s6qj\" (UID: \"bf421747-d273-4d48-bc0c-dd2947ac646a\") " pod="openshift-operators/observability-operator-59bdc8b94-8s6qj"
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.947673 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf421747-d273-4d48-bc0c-dd2947ac646a-observability-operator-tls\") pod \"observability-operator-59bdc8b94-8s6qj\" (UID: \"bf421747-d273-4d48-bc0c-dd2947ac646a\") " pod="openshift-operators/observability-operator-59bdc8b94-8s6qj"
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.947946 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d8ddb755b-qzw7j"
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.949846 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6gwp\" (UniqueName: \"kubernetes.io/projected/bf421747-d273-4d48-bc0c-dd2947ac646a-kube-api-access-j6gwp\") pod \"observability-operator-59bdc8b94-8s6qj\" (UID: \"bf421747-d273-4d48-bc0c-dd2947ac646a\") " pod="openshift-operators/observability-operator-59bdc8b94-8s6qj"
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.960099 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d8ddb755b-zz6rp"
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.969072 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-vm7rb"]
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.970762 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-vm7rb"
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.975803 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-lsl9g"
Feb 23 00:18:18 crc kubenswrapper[4735]: I0223 00:18:18.997388 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-vm7rb"]
Feb 23 00:18:19 crc kubenswrapper[4735]: I0223 00:18:19.026530 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47dmd\" (UniqueName: \"kubernetes.io/projected/dc58bb3c-27f0-4384-836e-caf92997ba93-kube-api-access-47dmd\") pod \"perses-operator-5bf474d74f-vm7rb\" (UID: \"dc58bb3c-27f0-4384-836e-caf92997ba93\") " pod="openshift-operators/perses-operator-5bf474d74f-vm7rb"
Feb 23 00:18:19 crc kubenswrapper[4735]: I0223 00:18:19.026595 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/dc58bb3c-27f0-4384-836e-caf92997ba93-openshift-service-ca\") pod \"perses-operator-5bf474d74f-vm7rb\" (UID: \"dc58bb3c-27f0-4384-836e-caf92997ba93\") " pod="openshift-operators/perses-operator-5bf474d74f-vm7rb"
Feb 23 00:18:19 crc kubenswrapper[4735]: I0223 00:18:19.083227 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-8s6qj"
Feb 23 00:18:19 crc kubenswrapper[4735]: I0223 00:18:19.127067 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47dmd\" (UniqueName: \"kubernetes.io/projected/dc58bb3c-27f0-4384-836e-caf92997ba93-kube-api-access-47dmd\") pod \"perses-operator-5bf474d74f-vm7rb\" (UID: \"dc58bb3c-27f0-4384-836e-caf92997ba93\") " pod="openshift-operators/perses-operator-5bf474d74f-vm7rb"
Feb 23 00:18:19 crc kubenswrapper[4735]: I0223 00:18:19.127118 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/dc58bb3c-27f0-4384-836e-caf92997ba93-openshift-service-ca\") pod \"perses-operator-5bf474d74f-vm7rb\" (UID: \"dc58bb3c-27f0-4384-836e-caf92997ba93\") " pod="openshift-operators/perses-operator-5bf474d74f-vm7rb"
Feb 23 00:18:19 crc kubenswrapper[4735]: I0223 00:18:19.129075 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/dc58bb3c-27f0-4384-836e-caf92997ba93-openshift-service-ca\") pod \"perses-operator-5bf474d74f-vm7rb\" (UID: \"dc58bb3c-27f0-4384-836e-caf92997ba93\") " pod="openshift-operators/perses-operator-5bf474d74f-vm7rb"
Feb 23 00:18:19 crc kubenswrapper[4735]: I0223 00:18:19.150600 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47dmd\" (UniqueName: \"kubernetes.io/projected/dc58bb3c-27f0-4384-836e-caf92997ba93-kube-api-access-47dmd\") pod \"perses-operator-5bf474d74f-vm7rb\" (UID: \"dc58bb3c-27f0-4384-836e-caf92997ba93\") " pod="openshift-operators/perses-operator-5bf474d74f-vm7rb"
Feb 23 00:18:19 crc kubenswrapper[4735]: I0223 00:18:19.285187 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-vm7rb"
Feb 23 00:18:19 crc kubenswrapper[4735]: I0223 00:18:19.361942 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jm8dz"
Feb 23 00:18:19 crc kubenswrapper[4735]: I0223 00:18:19.436439 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684-bundle\") pod \"84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684\" (UID: \"84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684\") "
Feb 23 00:18:19 crc kubenswrapper[4735]: I0223 00:18:19.436504 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684-util\") pod \"84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684\" (UID: \"84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684\") "
Feb 23 00:18:19 crc kubenswrapper[4735]: I0223 00:18:19.436561 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgw84\" (UniqueName: \"kubernetes.io/projected/84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684-kube-api-access-zgw84\") pod \"84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684\" (UID: \"84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684\") "
Feb 23 00:18:19 crc kubenswrapper[4735]: I0223 00:18:19.438380 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684-bundle" (OuterVolumeSpecName: "bundle") pod "84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684" (UID: "84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 00:18:19 crc kubenswrapper[4735]: I0223 00:18:19.439710 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684-kube-api-access-zgw84" (OuterVolumeSpecName: "kube-api-access-zgw84") pod "84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684" (UID: "84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684"). InnerVolumeSpecName "kube-api-access-zgw84". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 23 00:18:19 crc kubenswrapper[4735]: I0223 00:18:19.451752 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684-util" (OuterVolumeSpecName: "util") pod "84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684" (UID: "84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 23 00:18:19 crc kubenswrapper[4735]: I0223 00:18:19.510989 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-t22rv"]
Feb 23 00:18:19 crc kubenswrapper[4735]: W0223 00:18:19.518295 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e169620_1b41_4184_9268_ff74d8f3e1a5.slice/crio-9de00ad33b363246faaa7c2f1cc77d3bc2d9b23a91b7ea80c845d46c3589d406 WatchSource:0}: Error finding container 9de00ad33b363246faaa7c2f1cc77d3bc2d9b23a91b7ea80c845d46c3589d406: Status 404 returned error can't find the container with id 9de00ad33b363246faaa7c2f1cc77d3bc2d9b23a91b7ea80c845d46c3589d406
Feb 23 00:18:19 crc kubenswrapper[4735]: I0223 00:18:19.538081 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgw84\" (UniqueName: \"kubernetes.io/projected/84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684-kube-api-access-zgw84\") on node \"crc\" DevicePath \"\""
Feb 23 00:18:19 crc kubenswrapper[4735]: I0223 00:18:19.538132 4735 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684-bundle\") on node \"crc\" DevicePath \"\""
Feb 23 00:18:19 crc kubenswrapper[4735]: I0223 00:18:19.538149 4735 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684-util\") on node \"crc\" DevicePath \"\""
Feb 23 00:18:19 crc kubenswrapper[4735]: I0223 00:18:19.559272 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7d8ddb755b-qzw7j"]
Feb 23 00:18:19 crc kubenswrapper[4735]: W0223 00:18:19.571956 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8dc28938_196d_418f_8b9e_9e41eca4ee56.slice/crio-9e0807e519462017d0733bf7affb7bfcbd329bbd38f40c885b932f7fe8eca2e1 WatchSource:0}: Error finding container 9e0807e519462017d0733bf7affb7bfcbd329bbd38f40c885b932f7fe8eca2e1: Status 404 returned error can't find the container with id 9e0807e519462017d0733bf7affb7bfcbd329bbd38f40c885b932f7fe8eca2e1
Feb 23 00:18:19 crc kubenswrapper[4735]: I0223 00:18:19.572510 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7d8ddb755b-zz6rp"]
Feb 23 00:18:19 crc kubenswrapper[4735]: I0223 00:18:19.661680 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-8s6qj"]
Feb 23 00:18:19 crc kubenswrapper[4735]: W0223 00:18:19.667456 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf421747_d273_4d48_bc0c_dd2947ac646a.slice/crio-e4aac3d2442634278f0bd7adb571f00c57d026b68759d827c567ab45eed45e61 WatchSource:0}: Error finding container e4aac3d2442634278f0bd7adb571f00c57d026b68759d827c567ab45eed45e61: Status 404 returned error can't find the container with id e4aac3d2442634278f0bd7adb571f00c57d026b68759d827c567ab45eed45e61
Feb 23 00:18:19 crc kubenswrapper[4735]: I0223 00:18:19.708410 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-vm7rb"]
Feb 23 00:18:19 crc kubenswrapper[4735]: W0223 00:18:19.715841 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc58bb3c_27f0_4384_836e_caf92997ba93.slice/crio-b74055744f4e4af8dbcade8cb8a99d70da2ae8e3770806353b7d0ecb7e929980 WatchSource:0}: Error finding container b74055744f4e4af8dbcade8cb8a99d70da2ae8e3770806353b7d0ecb7e929980: Status 404 returned error can't find the container with id b74055744f4e4af8dbcade8cb8a99d70da2ae8e3770806353b7d0ecb7e929980
Feb 23 00:18:19 crc kubenswrapper[4735]: I0223 00:18:19.900276 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jm8dz" event={"ID":"84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684","Type":"ContainerDied","Data":"7b16eb7662d9d57abb156c9ae4468f650040a577a689e8d9b9cf4787e3bf367f"}
Feb 23 00:18:19 crc kubenswrapper[4735]: I0223 00:18:19.900323 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b16eb7662d9d57abb156c9ae4468f650040a577a689e8d9b9cf4787e3bf367f"
Feb 23 00:18:19 crc kubenswrapper[4735]: I0223 00:18:19.900390 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jm8dz"
Feb 23 00:18:19 crc kubenswrapper[4735]: I0223 00:18:19.904063 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d8ddb755b-zz6rp" event={"ID":"7a5bb4ed-d256-4bcf-aae6-1959a199d920","Type":"ContainerStarted","Data":"c5e93df51ca48db072f1ecde315d7aac4922628a1fdee4979f21b8bc2820ca7f"}
Feb 23 00:18:19 crc kubenswrapper[4735]: I0223 00:18:19.904986 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-8s6qj" event={"ID":"bf421747-d273-4d48-bc0c-dd2947ac646a","Type":"ContainerStarted","Data":"e4aac3d2442634278f0bd7adb571f00c57d026b68759d827c567ab45eed45e61"}
Feb 23 00:18:19 crc kubenswrapper[4735]: I0223 00:18:19.905688 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d8ddb755b-qzw7j" event={"ID":"8dc28938-196d-418f-8b9e-9e41eca4ee56","Type":"ContainerStarted","Data":"9e0807e519462017d0733bf7affb7bfcbd329bbd38f40c885b932f7fe8eca2e1"}
Feb 23 00:18:19 crc kubenswrapper[4735]: I0223 00:18:19.906872 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-vm7rb" event={"ID":"dc58bb3c-27f0-4384-836e-caf92997ba93","Type":"ContainerStarted","Data":"b74055744f4e4af8dbcade8cb8a99d70da2ae8e3770806353b7d0ecb7e929980"}
Feb 23 00:18:19 crc kubenswrapper[4735]: I0223 00:18:19.907779 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-t22rv" event={"ID":"1e169620-1b41-4184-9268-ff74d8f3e1a5","Type":"ContainerStarted","Data":"9de00ad33b363246faaa7c2f1cc77d3bc2d9b23a91b7ea80c845d46c3589d406"}
Feb 23 00:18:20 crc kubenswrapper[4735]: I0223 00:18:20.754070 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elastic-operator-688778444b-7lngw"]
Feb 23 00:18:20 crc kubenswrapper[4735]: E0223 00:18:20.763140 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684" containerName="util"
Feb 23 00:18:20 crc kubenswrapper[4735]: I0223 00:18:20.763171 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684" containerName="util"
Feb 23 00:18:20 crc kubenswrapper[4735]: E0223 00:18:20.763183 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684" containerName="pull"
Feb 23 00:18:20 crc kubenswrapper[4735]: I0223 00:18:20.763189 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684" containerName="pull"
Feb 23 00:18:20 crc kubenswrapper[4735]: E0223 00:18:20.763204 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684" containerName="extract"
Feb 23 00:18:20 crc kubenswrapper[4735]: I0223 00:18:20.763209 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684" containerName="extract"
Feb 23 00:18:20 crc kubenswrapper[4735]: I0223 00:18:20.763294 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684" containerName="extract"
Feb 23 00:18:20 crc kubenswrapper[4735]: I0223 00:18:20.763736 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elastic-operator-688778444b-7lngw"
Feb 23 00:18:20 crc kubenswrapper[4735]: I0223 00:18:20.772267 4735 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-dockercfg-xd66c"
Feb 23 00:18:20 crc kubenswrapper[4735]: I0223 00:18:20.772552 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"openshift-service-ca.crt"
Feb 23 00:18:20 crc kubenswrapper[4735]: I0223 00:18:20.772979 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"kube-root-ca.crt"
Feb 23 00:18:20 crc kubenswrapper[4735]: I0223 00:18:20.777205 4735 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-service-cert"
Feb 23 00:18:20 crc kubenswrapper[4735]: I0223 00:18:20.777484 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-688778444b-7lngw"]
Feb 23 00:18:20 crc kubenswrapper[4735]: I0223 00:18:20.851794 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9e14c74b-1d96-41df-83e9-aea090f66d5f-webhook-cert\") pod \"elastic-operator-688778444b-7lngw\" (UID: \"9e14c74b-1d96-41df-83e9-aea090f66d5f\") " pod="service-telemetry/elastic-operator-688778444b-7lngw"
Feb 23 00:18:20 crc kubenswrapper[4735]: I0223 00:18:20.852132 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9e14c74b-1d96-41df-83e9-aea090f66d5f-apiservice-cert\") pod \"elastic-operator-688778444b-7lngw\" (UID: \"9e14c74b-1d96-41df-83e9-aea090f66d5f\") " pod="service-telemetry/elastic-operator-688778444b-7lngw"
Feb 23 00:18:20 crc kubenswrapper[4735]: I0223 00:18:20.852153 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5vsd\" (UniqueName: \"kubernetes.io/projected/9e14c74b-1d96-41df-83e9-aea090f66d5f-kube-api-access-c5vsd\") pod \"elastic-operator-688778444b-7lngw\" (UID: \"9e14c74b-1d96-41df-83e9-aea090f66d5f\") " pod="service-telemetry/elastic-operator-688778444b-7lngw"
Feb 23 00:18:20 crc kubenswrapper[4735]: I0223 00:18:20.953449 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9e14c74b-1d96-41df-83e9-aea090f66d5f-webhook-cert\") pod \"elastic-operator-688778444b-7lngw\" (UID: \"9e14c74b-1d96-41df-83e9-aea090f66d5f\") " pod="service-telemetry/elastic-operator-688778444b-7lngw"
Feb 23 00:18:20 crc kubenswrapper[4735]: I0223 00:18:20.953534 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9e14c74b-1d96-41df-83e9-aea090f66d5f-apiservice-cert\") pod \"elastic-operator-688778444b-7lngw\" (UID: \"9e14c74b-1d96-41df-83e9-aea090f66d5f\") " pod="service-telemetry/elastic-operator-688778444b-7lngw"
Feb 23 00:18:20 crc kubenswrapper[4735]: I0223 00:18:20.953562 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5vsd\" (UniqueName: \"kubernetes.io/projected/9e14c74b-1d96-41df-83e9-aea090f66d5f-kube-api-access-c5vsd\") pod \"elastic-operator-688778444b-7lngw\" (UID: \"9e14c74b-1d96-41df-83e9-aea090f66d5f\") " pod="service-telemetry/elastic-operator-688778444b-7lngw"
Feb 23 00:18:20 crc kubenswrapper[4735]: I0223 00:18:20.961757 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9e14c74b-1d96-41df-83e9-aea090f66d5f-apiservice-cert\") pod \"elastic-operator-688778444b-7lngw\" (UID: \"9e14c74b-1d96-41df-83e9-aea090f66d5f\") " pod="service-telemetry/elastic-operator-688778444b-7lngw"
Feb 23 00:18:20 crc kubenswrapper[4735]: I0223 00:18:20.972454 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5vsd\" (UniqueName: \"kubernetes.io/projected/9e14c74b-1d96-41df-83e9-aea090f66d5f-kube-api-access-c5vsd\") pod \"elastic-operator-688778444b-7lngw\" (UID: \"9e14c74b-1d96-41df-83e9-aea090f66d5f\") " pod="service-telemetry/elastic-operator-688778444b-7lngw"
Feb 23 00:18:20 crc kubenswrapper[4735]: I0223 00:18:20.972549 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9e14c74b-1d96-41df-83e9-aea090f66d5f-webhook-cert\") pod \"elastic-operator-688778444b-7lngw\" (UID: \"9e14c74b-1d96-41df-83e9-aea090f66d5f\") " pod="service-telemetry/elastic-operator-688778444b-7lngw"
Feb 23 00:18:21 crc kubenswrapper[4735]: I0223 00:18:21.093244 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elastic-operator-688778444b-7lngw"
Feb 23 00:18:21 crc kubenswrapper[4735]: I0223 00:18:21.367762 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-688778444b-7lngw"]
Feb 23 00:18:21 crc kubenswrapper[4735]: I0223 00:18:21.926111 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-688778444b-7lngw" event={"ID":"9e14c74b-1d96-41df-83e9-aea090f66d5f","Type":"ContainerStarted","Data":"8a724e58ab8777bfa897b7f6dcc866f88d1c3bbd777d06fb73d8e702a243e1aa"}
Feb 23 00:18:32 crc kubenswrapper[4735]: I0223 00:18:32.034716 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d8ddb755b-zz6rp" event={"ID":"7a5bb4ed-d256-4bcf-aae6-1959a199d920","Type":"ContainerStarted","Data":"85169c316758d694d9f4fa62eb5b2a76314bf2bcf7536baba5c5589659f35e1c"}
Feb 23 00:18:32 crc kubenswrapper[4735]: I0223 00:18:32.037045 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-8s6qj" event={"ID":"bf421747-d273-4d48-bc0c-dd2947ac646a","Type":"ContainerStarted","Data":"18b06805ea7f2aaf5fb3501c610f6db0a556cb4f3e949432e46de24fa157e62b"}
Feb 23 00:18:32 crc kubenswrapper[4735]: I0223 00:18:32.037087 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-8s6qj"
Feb 23 00:18:32 crc kubenswrapper[4735]: I0223 00:18:32.038913 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d8ddb755b-qzw7j" event={"ID":"8dc28938-196d-418f-8b9e-9e41eca4ee56","Type":"ContainerStarted","Data":"7dcc1f5658d09098b8ee3c36c31ee0a80775f21211dee941772ae0dfb25638a0"}
Feb 23 00:18:32 crc kubenswrapper[4735]: I0223 00:18:32.040554 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-vm7rb" event={"ID":"dc58bb3c-27f0-4384-836e-caf92997ba93","Type":"ContainerStarted","Data":"0ebbb1afdd6e9e18feff9f0cb8f4798fb7917cb4feb3a069a8b96bca98cab1e1"}
Feb 23 00:18:32 crc kubenswrapper[4735]: I0223 00:18:32.040664 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-vm7rb"
Feb 23 00:18:32 crc kubenswrapper[4735]: I0223 00:18:32.042228 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-688778444b-7lngw" event={"ID":"9e14c74b-1d96-41df-83e9-aea090f66d5f","Type":"ContainerStarted","Data":"9b156ad80b4ffdac28c3a9c9f4a010e5f52615889667c5928eaa53a664b939d7"}
Feb 23 00:18:32 crc kubenswrapper[4735]: I0223 00:18:32.043862 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-t22rv" event={"ID":"1e169620-1b41-4184-9268-ff74d8f3e1a5","Type":"ContainerStarted","Data":"46ca4407442b0010cc5cd0659f0185d2f10f40d304380fd860b6d4eaf8918b50"}
Feb 23 00:18:32 crc kubenswrapper[4735]: I0223 00:18:32.054940 4735 pod_startup_latency_tracker.go:104]
"Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d8ddb755b-zz6rp" podStartSLOduration=2.230370413 podStartE2EDuration="14.054914724s" podCreationTimestamp="2026-02-23 00:18:18 +0000 UTC" firstStartedPulling="2026-02-23 00:18:19.578537855 +0000 UTC m=+658.042083836" lastFinishedPulling="2026-02-23 00:18:31.403082156 +0000 UTC m=+669.866628147" observedRunningTime="2026-02-23 00:18:32.05206401 +0000 UTC m=+670.515609991" watchObservedRunningTime="2026-02-23 00:18:32.054914724 +0000 UTC m=+670.518460695" Feb 23 00:18:32 crc kubenswrapper[4735]: I0223 00:18:32.073640 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-8s6qj" Feb 23 00:18:32 crc kubenswrapper[4735]: I0223 00:18:32.102890 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-8s6qj" podStartSLOduration=2.4104897530000002 podStartE2EDuration="14.102871712s" podCreationTimestamp="2026-02-23 00:18:18 +0000 UTC" firstStartedPulling="2026-02-23 00:18:19.67036527 +0000 UTC m=+658.133911241" lastFinishedPulling="2026-02-23 00:18:31.362747229 +0000 UTC m=+669.826293200" observedRunningTime="2026-02-23 00:18:32.078256059 +0000 UTC m=+670.541802020" watchObservedRunningTime="2026-02-23 00:18:32.102871712 +0000 UTC m=+670.566417683" Feb 23 00:18:32 crc kubenswrapper[4735]: I0223 00:18:32.103889 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-vm7rb" podStartSLOduration=2.419299702 podStartE2EDuration="14.103885415s" podCreationTimestamp="2026-02-23 00:18:18 +0000 UTC" firstStartedPulling="2026-02-23 00:18:19.71881197 +0000 UTC m=+658.182357941" lastFinishedPulling="2026-02-23 00:18:31.403397683 +0000 UTC m=+669.866943654" observedRunningTime="2026-02-23 00:18:32.101531152 +0000 UTC m=+670.565077123" watchObservedRunningTime="2026-02-23 
00:18:32.103885415 +0000 UTC m=+670.567431386" Feb 23 00:18:32 crc kubenswrapper[4735]: I0223 00:18:32.117989 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elastic-operator-688778444b-7lngw" podStartSLOduration=2.150175295 podStartE2EDuration="12.117974182s" podCreationTimestamp="2026-02-23 00:18:20 +0000 UTC" firstStartedPulling="2026-02-23 00:18:21.394639205 +0000 UTC m=+659.858185176" lastFinishedPulling="2026-02-23 00:18:31.362438092 +0000 UTC m=+669.825984063" observedRunningTime="2026-02-23 00:18:32.117256186 +0000 UTC m=+670.580802157" watchObservedRunningTime="2026-02-23 00:18:32.117974182 +0000 UTC m=+670.581520153" Feb 23 00:18:32 crc kubenswrapper[4735]: I0223 00:18:32.147358 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-t22rv" podStartSLOduration=2.306386612 podStartE2EDuration="14.147342522s" podCreationTimestamp="2026-02-23 00:18:18 +0000 UTC" firstStartedPulling="2026-02-23 00:18:19.520865088 +0000 UTC m=+657.984411089" lastFinishedPulling="2026-02-23 00:18:31.361821028 +0000 UTC m=+669.825366999" observedRunningTime="2026-02-23 00:18:32.143311872 +0000 UTC m=+670.606857843" watchObservedRunningTime="2026-02-23 00:18:32.147342522 +0000 UTC m=+670.610888493" Feb 23 00:18:32 crc kubenswrapper[4735]: I0223 00:18:32.157755 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7d8ddb755b-qzw7j" podStartSLOduration=2.368740956 podStartE2EDuration="14.157738577s" podCreationTimestamp="2026-02-23 00:18:18 +0000 UTC" firstStartedPulling="2026-02-23 00:18:19.573434971 +0000 UTC m=+658.036980952" lastFinishedPulling="2026-02-23 00:18:31.362432612 +0000 UTC m=+669.825978573" observedRunningTime="2026-02-23 00:18:32.15615523 +0000 UTC m=+670.619701201" watchObservedRunningTime="2026-02-23 00:18:32.157738577 +0000 UTC m=+670.621284548" Feb 23 
00:18:35 crc kubenswrapper[4735]: I0223 00:18:35.856645 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-jm2wf"] Feb 23 00:18:35 crc kubenswrapper[4735]: I0223 00:18:35.857828 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-jm2wf" Feb 23 00:18:35 crc kubenswrapper[4735]: I0223 00:18:35.866006 4735 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-2ff54" Feb 23 00:18:35 crc kubenswrapper[4735]: I0223 00:18:35.868494 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Feb 23 00:18:35 crc kubenswrapper[4735]: I0223 00:18:35.868744 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Feb 23 00:18:35 crc kubenswrapper[4735]: I0223 00:18:35.876665 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-jm2wf"] Feb 23 00:18:35 crc kubenswrapper[4735]: I0223 00:18:35.958564 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/08df1ac4-d045-4145-ae0c-18c978c9c91e-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-jm2wf\" (UID: \"08df1ac4-d045-4145-ae0c-18c978c9c91e\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-jm2wf" Feb 23 00:18:35 crc kubenswrapper[4735]: I0223 00:18:35.958652 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj462\" (UniqueName: \"kubernetes.io/projected/08df1ac4-d045-4145-ae0c-18c978c9c91e-kube-api-access-mj462\") pod 
\"cert-manager-operator-controller-manager-5586865c96-jm2wf\" (UID: \"08df1ac4-d045-4145-ae0c-18c978c9c91e\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-jm2wf" Feb 23 00:18:36 crc kubenswrapper[4735]: I0223 00:18:36.059755 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/08df1ac4-d045-4145-ae0c-18c978c9c91e-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-jm2wf\" (UID: \"08df1ac4-d045-4145-ae0c-18c978c9c91e\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-jm2wf" Feb 23 00:18:36 crc kubenswrapper[4735]: I0223 00:18:36.059831 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj462\" (UniqueName: \"kubernetes.io/projected/08df1ac4-d045-4145-ae0c-18c978c9c91e-kube-api-access-mj462\") pod \"cert-manager-operator-controller-manager-5586865c96-jm2wf\" (UID: \"08df1ac4-d045-4145-ae0c-18c978c9c91e\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-jm2wf" Feb 23 00:18:36 crc kubenswrapper[4735]: I0223 00:18:36.060272 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/08df1ac4-d045-4145-ae0c-18c978c9c91e-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-jm2wf\" (UID: \"08df1ac4-d045-4145-ae0c-18c978c9c91e\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-jm2wf" Feb 23 00:18:36 crc kubenswrapper[4735]: I0223 00:18:36.101664 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj462\" (UniqueName: \"kubernetes.io/projected/08df1ac4-d045-4145-ae0c-18c978c9c91e-kube-api-access-mj462\") pod \"cert-manager-operator-controller-manager-5586865c96-jm2wf\" (UID: \"08df1ac4-d045-4145-ae0c-18c978c9c91e\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-jm2wf" Feb 23 00:18:36 crc kubenswrapper[4735]: I0223 00:18:36.174709 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-jm2wf" Feb 23 00:18:36 crc kubenswrapper[4735]: I0223 00:18:36.992637 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-jm2wf"] Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.072915 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-jm2wf" event={"ID":"08df1ac4-d045-4145-ae0c-18c978c9c91e","Type":"ContainerStarted","Data":"b802071a153bf71aef82d54411cf0b8c186a33a609f12a3ff80426f18ad02ef0"} Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.198951 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.200215 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.203915 4735 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-internal-users" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.203936 4735 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-remote-ca" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.204321 4735 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-http-certs-internal" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.204542 4735 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-transport-certs" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.205807 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-scripts" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.206379 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-unicast-hosts" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.206457 4735 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-config" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.206526 4735 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-xpack-file-realm" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.206677 4735 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-dockercfg-x8lgs" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.243592 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.275080 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/aafde9e6-828a-44a3-b5e9-8ec98a576b23-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"aafde9e6-828a-44a3-b5e9-8ec98a576b23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.275124 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/aafde9e6-828a-44a3-b5e9-8ec98a576b23-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"aafde9e6-828a-44a3-b5e9-8ec98a576b23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.275141 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/aafde9e6-828a-44a3-b5e9-8ec98a576b23-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"aafde9e6-828a-44a3-b5e9-8ec98a576b23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.275169 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/aafde9e6-828a-44a3-b5e9-8ec98a576b23-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"aafde9e6-828a-44a3-b5e9-8ec98a576b23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.275193 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/aafde9e6-828a-44a3-b5e9-8ec98a576b23-elastic-internal-http-certificates\") pod 
\"elasticsearch-es-default-0\" (UID: \"aafde9e6-828a-44a3-b5e9-8ec98a576b23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.275377 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/aafde9e6-828a-44a3-b5e9-8ec98a576b23-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"aafde9e6-828a-44a3-b5e9-8ec98a576b23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.275448 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/aafde9e6-828a-44a3-b5e9-8ec98a576b23-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"aafde9e6-828a-44a3-b5e9-8ec98a576b23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.275472 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/aafde9e6-828a-44a3-b5e9-8ec98a576b23-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"aafde9e6-828a-44a3-b5e9-8ec98a576b23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.275527 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/aafde9e6-828a-44a3-b5e9-8ec98a576b23-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"aafde9e6-828a-44a3-b5e9-8ec98a576b23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.275549 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/aafde9e6-828a-44a3-b5e9-8ec98a576b23-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"aafde9e6-828a-44a3-b5e9-8ec98a576b23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.275565 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/aafde9e6-828a-44a3-b5e9-8ec98a576b23-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"aafde9e6-828a-44a3-b5e9-8ec98a576b23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.275631 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/aafde9e6-828a-44a3-b5e9-8ec98a576b23-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"aafde9e6-828a-44a3-b5e9-8ec98a576b23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.275678 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/aafde9e6-828a-44a3-b5e9-8ec98a576b23-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"aafde9e6-828a-44a3-b5e9-8ec98a576b23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.275732 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: 
\"kubernetes.io/secret/aafde9e6-828a-44a3-b5e9-8ec98a576b23-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"aafde9e6-828a-44a3-b5e9-8ec98a576b23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.275759 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/aafde9e6-828a-44a3-b5e9-8ec98a576b23-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"aafde9e6-828a-44a3-b5e9-8ec98a576b23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.377151 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/aafde9e6-828a-44a3-b5e9-8ec98a576b23-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"aafde9e6-828a-44a3-b5e9-8ec98a576b23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.377240 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/aafde9e6-828a-44a3-b5e9-8ec98a576b23-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"aafde9e6-828a-44a3-b5e9-8ec98a576b23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.377289 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/aafde9e6-828a-44a3-b5e9-8ec98a576b23-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"aafde9e6-828a-44a3-b5e9-8ec98a576b23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 
00:18:37.377334 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/aafde9e6-828a-44a3-b5e9-8ec98a576b23-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"aafde9e6-828a-44a3-b5e9-8ec98a576b23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.377513 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/aafde9e6-828a-44a3-b5e9-8ec98a576b23-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"aafde9e6-828a-44a3-b5e9-8ec98a576b23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.378322 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/aafde9e6-828a-44a3-b5e9-8ec98a576b23-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"aafde9e6-828a-44a3-b5e9-8ec98a576b23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.378382 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/aafde9e6-828a-44a3-b5e9-8ec98a576b23-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"aafde9e6-828a-44a3-b5e9-8ec98a576b23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.378413 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/aafde9e6-828a-44a3-b5e9-8ec98a576b23-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"aafde9e6-828a-44a3-b5e9-8ec98a576b23\") " 
pod="service-telemetry/elasticsearch-es-default-0" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.378445 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/aafde9e6-828a-44a3-b5e9-8ec98a576b23-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"aafde9e6-828a-44a3-b5e9-8ec98a576b23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.378490 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/aafde9e6-828a-44a3-b5e9-8ec98a576b23-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"aafde9e6-828a-44a3-b5e9-8ec98a576b23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.378523 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/aafde9e6-828a-44a3-b5e9-8ec98a576b23-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"aafde9e6-828a-44a3-b5e9-8ec98a576b23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.378564 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/aafde9e6-828a-44a3-b5e9-8ec98a576b23-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"aafde9e6-828a-44a3-b5e9-8ec98a576b23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.378601 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: 
\"kubernetes.io/secret/aafde9e6-828a-44a3-b5e9-8ec98a576b23-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"aafde9e6-828a-44a3-b5e9-8ec98a576b23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.378633 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/aafde9e6-828a-44a3-b5e9-8ec98a576b23-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"aafde9e6-828a-44a3-b5e9-8ec98a576b23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.378681 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/aafde9e6-828a-44a3-b5e9-8ec98a576b23-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"aafde9e6-828a-44a3-b5e9-8ec98a576b23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.378966 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/aafde9e6-828a-44a3-b5e9-8ec98a576b23-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"aafde9e6-828a-44a3-b5e9-8ec98a576b23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.379145 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/aafde9e6-828a-44a3-b5e9-8ec98a576b23-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"aafde9e6-828a-44a3-b5e9-8ec98a576b23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.379223 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/aafde9e6-828a-44a3-b5e9-8ec98a576b23-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"aafde9e6-828a-44a3-b5e9-8ec98a576b23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.379279 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/aafde9e6-828a-44a3-b5e9-8ec98a576b23-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"aafde9e6-828a-44a3-b5e9-8ec98a576b23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.380260 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/aafde9e6-828a-44a3-b5e9-8ec98a576b23-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"aafde9e6-828a-44a3-b5e9-8ec98a576b23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.380677 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/aafde9e6-828a-44a3-b5e9-8ec98a576b23-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"aafde9e6-828a-44a3-b5e9-8ec98a576b23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.381016 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/aafde9e6-828a-44a3-b5e9-8ec98a576b23-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"aafde9e6-828a-44a3-b5e9-8ec98a576b23\") " pod="service-telemetry/elasticsearch-es-default-0" 
Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.384120 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/aafde9e6-828a-44a3-b5e9-8ec98a576b23-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"aafde9e6-828a-44a3-b5e9-8ec98a576b23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.384813 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/aafde9e6-828a-44a3-b5e9-8ec98a576b23-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"aafde9e6-828a-44a3-b5e9-8ec98a576b23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.388064 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/aafde9e6-828a-44a3-b5e9-8ec98a576b23-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"aafde9e6-828a-44a3-b5e9-8ec98a576b23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.388545 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/aafde9e6-828a-44a3-b5e9-8ec98a576b23-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"aafde9e6-828a-44a3-b5e9-8ec98a576b23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.388695 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/aafde9e6-828a-44a3-b5e9-8ec98a576b23-elastic-internal-http-certificates\") pod 
\"elasticsearch-es-default-0\" (UID: \"aafde9e6-828a-44a3-b5e9-8ec98a576b23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.389216 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/aafde9e6-828a-44a3-b5e9-8ec98a576b23-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"aafde9e6-828a-44a3-b5e9-8ec98a576b23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.392604 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/aafde9e6-828a-44a3-b5e9-8ec98a576b23-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"aafde9e6-828a-44a3-b5e9-8ec98a576b23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.394174 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/aafde9e6-828a-44a3-b5e9-8ec98a576b23-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"aafde9e6-828a-44a3-b5e9-8ec98a576b23\") " pod="service-telemetry/elasticsearch-es-default-0" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.522709 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0" Feb 23 00:18:37 crc kubenswrapper[4735]: I0223 00:18:37.845382 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Feb 23 00:18:37 crc kubenswrapper[4735]: W0223 00:18:37.871336 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaafde9e6_828a_44a3_b5e9_8ec98a576b23.slice/crio-847083910b7a751b1b2c18088f09b4c9e96340abee87d86f8181f0416fbd00ce WatchSource:0}: Error finding container 847083910b7a751b1b2c18088f09b4c9e96340abee87d86f8181f0416fbd00ce: Status 404 returned error can't find the container with id 847083910b7a751b1b2c18088f09b4c9e96340abee87d86f8181f0416fbd00ce Feb 23 00:18:38 crc kubenswrapper[4735]: I0223 00:18:38.078919 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"aafde9e6-828a-44a3-b5e9-8ec98a576b23","Type":"ContainerStarted","Data":"847083910b7a751b1b2c18088f09b4c9e96340abee87d86f8181f0416fbd00ce"} Feb 23 00:18:39 crc kubenswrapper[4735]: I0223 00:18:39.288721 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-vm7rb" Feb 23 00:18:46 crc kubenswrapper[4735]: I0223 00:18:46.130813 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-jm2wf" event={"ID":"08df1ac4-d045-4145-ae0c-18c978c9c91e","Type":"ContainerStarted","Data":"175048e4b8385c3c68c45f4420faf341bf4b31194e1ce1aa93c40f48434bb08b"} Feb 23 00:18:46 crc kubenswrapper[4735]: I0223 00:18:46.156624 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-jm2wf" podStartSLOduration=3.038743129 podStartE2EDuration="11.156608544s" podCreationTimestamp="2026-02-23 00:18:35 +0000 
UTC" firstStartedPulling="2026-02-23 00:18:36.998180089 +0000 UTC m=+675.461726060" lastFinishedPulling="2026-02-23 00:18:45.116045504 +0000 UTC m=+683.579591475" observedRunningTime="2026-02-23 00:18:46.153142075 +0000 UTC m=+684.616688056" watchObservedRunningTime="2026-02-23 00:18:46.156608544 +0000 UTC m=+684.620154515" Feb 23 00:18:52 crc kubenswrapper[4735]: I0223 00:18:52.367519 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-lqrn9"] Feb 23 00:18:52 crc kubenswrapper[4735]: I0223 00:18:52.368521 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-lqrn9" Feb 23 00:18:52 crc kubenswrapper[4735]: I0223 00:18:52.372501 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 23 00:18:52 crc kubenswrapper[4735]: I0223 00:18:52.372581 4735 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-w9ql5" Feb 23 00:18:52 crc kubenswrapper[4735]: I0223 00:18:52.375368 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 23 00:18:52 crc kubenswrapper[4735]: I0223 00:18:52.380334 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-lqrn9"] Feb 23 00:18:52 crc kubenswrapper[4735]: I0223 00:18:52.474944 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2f59de9f-a682-4d8d-87d7-8f36ef7c6fa8-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-lqrn9\" (UID: \"2f59de9f-a682-4d8d-87d7-8f36ef7c6fa8\") " pod="cert-manager/cert-manager-cainjector-5545bd876-lqrn9" Feb 23 00:18:52 crc kubenswrapper[4735]: I0223 00:18:52.475012 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-79qg9\" (UniqueName: \"kubernetes.io/projected/2f59de9f-a682-4d8d-87d7-8f36ef7c6fa8-kube-api-access-79qg9\") pod \"cert-manager-cainjector-5545bd876-lqrn9\" (UID: \"2f59de9f-a682-4d8d-87d7-8f36ef7c6fa8\") " pod="cert-manager/cert-manager-cainjector-5545bd876-lqrn9" Feb 23 00:18:52 crc kubenswrapper[4735]: I0223 00:18:52.576317 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79qg9\" (UniqueName: \"kubernetes.io/projected/2f59de9f-a682-4d8d-87d7-8f36ef7c6fa8-kube-api-access-79qg9\") pod \"cert-manager-cainjector-5545bd876-lqrn9\" (UID: \"2f59de9f-a682-4d8d-87d7-8f36ef7c6fa8\") " pod="cert-manager/cert-manager-cainjector-5545bd876-lqrn9" Feb 23 00:18:52 crc kubenswrapper[4735]: I0223 00:18:52.576421 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2f59de9f-a682-4d8d-87d7-8f36ef7c6fa8-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-lqrn9\" (UID: \"2f59de9f-a682-4d8d-87d7-8f36ef7c6fa8\") " pod="cert-manager/cert-manager-cainjector-5545bd876-lqrn9" Feb 23 00:18:52 crc kubenswrapper[4735]: I0223 00:18:52.608448 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2f59de9f-a682-4d8d-87d7-8f36ef7c6fa8-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-lqrn9\" (UID: \"2f59de9f-a682-4d8d-87d7-8f36ef7c6fa8\") " pod="cert-manager/cert-manager-cainjector-5545bd876-lqrn9" Feb 23 00:18:52 crc kubenswrapper[4735]: I0223 00:18:52.609287 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79qg9\" (UniqueName: \"kubernetes.io/projected/2f59de9f-a682-4d8d-87d7-8f36ef7c6fa8-kube-api-access-79qg9\") pod \"cert-manager-cainjector-5545bd876-lqrn9\" (UID: \"2f59de9f-a682-4d8d-87d7-8f36ef7c6fa8\") " pod="cert-manager/cert-manager-cainjector-5545bd876-lqrn9" Feb 23 00:18:52 crc kubenswrapper[4735]: 
I0223 00:18:52.684093 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-lqrn9" Feb 23 00:18:54 crc kubenswrapper[4735]: I0223 00:18:54.126581 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-lqrn9"] Feb 23 00:18:54 crc kubenswrapper[4735]: W0223 00:18:54.129959 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f59de9f_a682_4d8d_87d7_8f36ef7c6fa8.slice/crio-7786b744fa7e03d09f36b3397810a7884ebf8f3ae73662b4066abb0555097a37 WatchSource:0}: Error finding container 7786b744fa7e03d09f36b3397810a7884ebf8f3ae73662b4066abb0555097a37: Status 404 returned error can't find the container with id 7786b744fa7e03d09f36b3397810a7884ebf8f3ae73662b4066abb0555097a37 Feb 23 00:18:54 crc kubenswrapper[4735]: I0223 00:18:54.183511 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"aafde9e6-828a-44a3-b5e9-8ec98a576b23","Type":"ContainerStarted","Data":"d25aba20c50acd0a7c51fdd86909d603e34da294ae48bd52f09cda76d8b9d81e"} Feb 23 00:18:54 crc kubenswrapper[4735]: I0223 00:18:54.188135 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-lqrn9" event={"ID":"2f59de9f-a682-4d8d-87d7-8f36ef7c6fa8","Type":"ContainerStarted","Data":"7786b744fa7e03d09f36b3397810a7884ebf8f3ae73662b4066abb0555097a37"} Feb 23 00:18:54 crc kubenswrapper[4735]: I0223 00:18:54.391155 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Feb 23 00:18:54 crc kubenswrapper[4735]: I0223 00:18:54.415986 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Feb 23 00:18:55 crc kubenswrapper[4735]: I0223 00:18:55.234262 4735 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["cert-manager/cert-manager-webhook-6888856db4-28whv"] Feb 23 00:18:55 crc kubenswrapper[4735]: I0223 00:18:55.235144 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-28whv" Feb 23 00:18:55 crc kubenswrapper[4735]: I0223 00:18:55.238489 4735 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-d4hxq" Feb 23 00:18:55 crc kubenswrapper[4735]: I0223 00:18:55.244047 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-28whv"] Feb 23 00:18:55 crc kubenswrapper[4735]: I0223 00:18:55.318025 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qfh6\" (UniqueName: \"kubernetes.io/projected/36e0600e-f4a8-4c41-988a-f64e2a5db19f-kube-api-access-8qfh6\") pod \"cert-manager-webhook-6888856db4-28whv\" (UID: \"36e0600e-f4a8-4c41-988a-f64e2a5db19f\") " pod="cert-manager/cert-manager-webhook-6888856db4-28whv" Feb 23 00:18:55 crc kubenswrapper[4735]: I0223 00:18:55.318069 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/36e0600e-f4a8-4c41-988a-f64e2a5db19f-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-28whv\" (UID: \"36e0600e-f4a8-4c41-988a-f64e2a5db19f\") " pod="cert-manager/cert-manager-webhook-6888856db4-28whv" Feb 23 00:18:55 crc kubenswrapper[4735]: I0223 00:18:55.419478 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qfh6\" (UniqueName: \"kubernetes.io/projected/36e0600e-f4a8-4c41-988a-f64e2a5db19f-kube-api-access-8qfh6\") pod \"cert-manager-webhook-6888856db4-28whv\" (UID: \"36e0600e-f4a8-4c41-988a-f64e2a5db19f\") " pod="cert-manager/cert-manager-webhook-6888856db4-28whv" Feb 23 00:18:55 crc kubenswrapper[4735]: I0223 00:18:55.419564 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/36e0600e-f4a8-4c41-988a-f64e2a5db19f-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-28whv\" (UID: \"36e0600e-f4a8-4c41-988a-f64e2a5db19f\") " pod="cert-manager/cert-manager-webhook-6888856db4-28whv" Feb 23 00:18:55 crc kubenswrapper[4735]: I0223 00:18:55.454035 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qfh6\" (UniqueName: \"kubernetes.io/projected/36e0600e-f4a8-4c41-988a-f64e2a5db19f-kube-api-access-8qfh6\") pod \"cert-manager-webhook-6888856db4-28whv\" (UID: \"36e0600e-f4a8-4c41-988a-f64e2a5db19f\") " pod="cert-manager/cert-manager-webhook-6888856db4-28whv" Feb 23 00:18:55 crc kubenswrapper[4735]: I0223 00:18:55.454159 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/36e0600e-f4a8-4c41-988a-f64e2a5db19f-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-28whv\" (UID: \"36e0600e-f4a8-4c41-988a-f64e2a5db19f\") " pod="cert-manager/cert-manager-webhook-6888856db4-28whv" Feb 23 00:18:55 crc kubenswrapper[4735]: I0223 00:18:55.552811 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-28whv" Feb 23 00:18:56 crc kubenswrapper[4735]: I0223 00:18:56.028787 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-28whv"] Feb 23 00:18:56 crc kubenswrapper[4735]: I0223 00:18:56.201683 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-28whv" event={"ID":"36e0600e-f4a8-4c41-988a-f64e2a5db19f","Type":"ContainerStarted","Data":"a7b5991d0792ae12979f2d7c6c13da0c48ad10da8a72b0cb1370cea601d3542e"} Feb 23 00:18:56 crc kubenswrapper[4735]: I0223 00:18:56.204333 4735 generic.go:334] "Generic (PLEG): container finished" podID="aafde9e6-828a-44a3-b5e9-8ec98a576b23" containerID="d25aba20c50acd0a7c51fdd86909d603e34da294ae48bd52f09cda76d8b9d81e" exitCode=0 Feb 23 00:18:56 crc kubenswrapper[4735]: I0223 00:18:56.204550 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"aafde9e6-828a-44a3-b5e9-8ec98a576b23","Type":"ContainerDied","Data":"d25aba20c50acd0a7c51fdd86909d603e34da294ae48bd52f09cda76d8b9d81e"} Feb 23 00:18:57 crc kubenswrapper[4735]: I0223 00:18:57.213495 4735 generic.go:334] "Generic (PLEG): container finished" podID="aafde9e6-828a-44a3-b5e9-8ec98a576b23" containerID="0373a3032b418bc04ecf1b1c911537f1cc01e63fb263880a0355626537127d73" exitCode=0 Feb 23 00:18:57 crc kubenswrapper[4735]: I0223 00:18:57.213901 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"aafde9e6-828a-44a3-b5e9-8ec98a576b23","Type":"ContainerDied","Data":"0373a3032b418bc04ecf1b1c911537f1cc01e63fb263880a0355626537127d73"} Feb 23 00:18:58 crc kubenswrapper[4735]: I0223 00:18:58.226675 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" 
event={"ID":"aafde9e6-828a-44a3-b5e9-8ec98a576b23","Type":"ContainerStarted","Data":"491a461bce48e68ace3eee4083febb31fb7ddabf021be7e5e308faf44a1131a1"} Feb 23 00:18:58 crc kubenswrapper[4735]: I0223 00:18:58.227054 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/elasticsearch-es-default-0" Feb 23 00:18:58 crc kubenswrapper[4735]: I0223 00:18:58.261438 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elasticsearch-es-default-0" podStartSLOduration=5.354485891 podStartE2EDuration="21.261422727s" podCreationTimestamp="2026-02-23 00:18:37 +0000 UTC" firstStartedPulling="2026-02-23 00:18:37.873718168 +0000 UTC m=+676.337264139" lastFinishedPulling="2026-02-23 00:18:53.780655004 +0000 UTC m=+692.244200975" observedRunningTime="2026-02-23 00:18:58.26021551 +0000 UTC m=+696.723761481" watchObservedRunningTime="2026-02-23 00:18:58.261422727 +0000 UTC m=+696.724968698" Feb 23 00:19:00 crc kubenswrapper[4735]: I0223 00:19:00.238249 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-28whv" event={"ID":"36e0600e-f4a8-4c41-988a-f64e2a5db19f","Type":"ContainerStarted","Data":"6933bedcd73e16b3f749544c9a261b1828bcd55fabfdbe7ae09d31b2adf0336d"} Feb 23 00:19:00 crc kubenswrapper[4735]: I0223 00:19:00.238314 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-28whv" Feb 23 00:19:00 crc kubenswrapper[4735]: I0223 00:19:00.240986 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-lqrn9" event={"ID":"2f59de9f-a682-4d8d-87d7-8f36ef7c6fa8","Type":"ContainerStarted","Data":"d18da3f7c63eb76f36ec1cc0c1ff040276eac773a319c1f0d745950363736009"} Feb 23 00:19:00 crc kubenswrapper[4735]: I0223 00:19:00.273830 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="cert-manager/cert-manager-webhook-6888856db4-28whv" podStartSLOduration=1.321021952 podStartE2EDuration="5.273805992s" podCreationTimestamp="2026-02-23 00:18:55 +0000 UTC" firstStartedPulling="2026-02-23 00:18:56.040266358 +0000 UTC m=+694.503812339" lastFinishedPulling="2026-02-23 00:18:59.993050368 +0000 UTC m=+698.456596379" observedRunningTime="2026-02-23 00:19:00.267551771 +0000 UTC m=+698.731097742" watchObservedRunningTime="2026-02-23 00:19:00.273805992 +0000 UTC m=+698.737351973" Feb 23 00:19:00 crc kubenswrapper[4735]: I0223 00:19:00.292238 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-lqrn9" podStartSLOduration=2.458030937 podStartE2EDuration="8.292223497s" podCreationTimestamp="2026-02-23 00:18:52 +0000 UTC" firstStartedPulling="2026-02-23 00:18:54.132565357 +0000 UTC m=+692.596111328" lastFinishedPulling="2026-02-23 00:18:59.966757907 +0000 UTC m=+698.430303888" observedRunningTime="2026-02-23 00:19:00.290819195 +0000 UTC m=+698.754365166" watchObservedRunningTime="2026-02-23 00:19:00.292223497 +0000 UTC m=+698.755769468" Feb 23 00:19:05 crc kubenswrapper[4735]: I0223 00:19:05.556117 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-28whv" Feb 23 00:19:07 crc kubenswrapper[4735]: I0223 00:19:07.618773 4735 prober.go:107] "Probe failed" probeType="Readiness" pod="service-telemetry/elasticsearch-es-default-0" podUID="aafde9e6-828a-44a3-b5e9-8ec98a576b23" containerName="elasticsearch" probeResult="failure" output=< Feb 23 00:19:07 crc kubenswrapper[4735]: {"timestamp": "2026-02-23T00:19:07+00:00", "message": "readiness probe failed", "curl_rc": "7"} Feb 23 00:19:07 crc kubenswrapper[4735]: > Feb 23 00:19:08 crc kubenswrapper[4735]: I0223 00:19:08.593114 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-qxczd"] Feb 23 00:19:08 crc kubenswrapper[4735]: I0223 
00:19:08.594557 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-qxczd" Feb 23 00:19:08 crc kubenswrapper[4735]: I0223 00:19:08.602185 4735 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-cqmj2" Feb 23 00:19:08 crc kubenswrapper[4735]: I0223 00:19:08.614774 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-qxczd"] Feb 23 00:19:08 crc kubenswrapper[4735]: I0223 00:19:08.719898 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/70509649-f213-44de-83a0-1be5da4e7a13-bound-sa-token\") pod \"cert-manager-545d4d4674-qxczd\" (UID: \"70509649-f213-44de-83a0-1be5da4e7a13\") " pod="cert-manager/cert-manager-545d4d4674-qxczd" Feb 23 00:19:08 crc kubenswrapper[4735]: I0223 00:19:08.720550 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-589p4\" (UniqueName: \"kubernetes.io/projected/70509649-f213-44de-83a0-1be5da4e7a13-kube-api-access-589p4\") pod \"cert-manager-545d4d4674-qxczd\" (UID: \"70509649-f213-44de-83a0-1be5da4e7a13\") " pod="cert-manager/cert-manager-545d4d4674-qxczd" Feb 23 00:19:08 crc kubenswrapper[4735]: I0223 00:19:08.822006 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/70509649-f213-44de-83a0-1be5da4e7a13-bound-sa-token\") pod \"cert-manager-545d4d4674-qxczd\" (UID: \"70509649-f213-44de-83a0-1be5da4e7a13\") " pod="cert-manager/cert-manager-545d4d4674-qxczd" Feb 23 00:19:08 crc kubenswrapper[4735]: I0223 00:19:08.822176 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-589p4\" (UniqueName: \"kubernetes.io/projected/70509649-f213-44de-83a0-1be5da4e7a13-kube-api-access-589p4\") pod 
\"cert-manager-545d4d4674-qxczd\" (UID: \"70509649-f213-44de-83a0-1be5da4e7a13\") " pod="cert-manager/cert-manager-545d4d4674-qxczd" Feb 23 00:19:08 crc kubenswrapper[4735]: I0223 00:19:08.852282 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-589p4\" (UniqueName: \"kubernetes.io/projected/70509649-f213-44de-83a0-1be5da4e7a13-kube-api-access-589p4\") pod \"cert-manager-545d4d4674-qxczd\" (UID: \"70509649-f213-44de-83a0-1be5da4e7a13\") " pod="cert-manager/cert-manager-545d4d4674-qxczd" Feb 23 00:19:08 crc kubenswrapper[4735]: I0223 00:19:08.856192 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/70509649-f213-44de-83a0-1be5da4e7a13-bound-sa-token\") pod \"cert-manager-545d4d4674-qxczd\" (UID: \"70509649-f213-44de-83a0-1be5da4e7a13\") " pod="cert-manager/cert-manager-545d4d4674-qxczd" Feb 23 00:19:08 crc kubenswrapper[4735]: I0223 00:19:08.916828 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-qxczd" Feb 23 00:19:09 crc kubenswrapper[4735]: I0223 00:19:09.201869 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-qxczd"] Feb 23 00:19:09 crc kubenswrapper[4735]: I0223 00:19:09.311779 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-qxczd" event={"ID":"70509649-f213-44de-83a0-1be5da4e7a13","Type":"ContainerStarted","Data":"fb99fdc94e48c06c6b1e0a9e4c4a3884f9425828642700c86eee40e3a14089c7"} Feb 23 00:19:10 crc kubenswrapper[4735]: I0223 00:19:10.320548 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-qxczd" event={"ID":"70509649-f213-44de-83a0-1be5da4e7a13","Type":"ContainerStarted","Data":"986c51e8d42189167828604e06e8918210bfe059580a104de9729655beb91b42"} Feb 23 00:19:10 crc kubenswrapper[4735]: I0223 00:19:10.342414 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-qxczd" podStartSLOduration=2.342394205 podStartE2EDuration="2.342394205s" podCreationTimestamp="2026-02-23 00:19:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:19:10.341558936 +0000 UTC m=+708.805105007" watchObservedRunningTime="2026-02-23 00:19:10.342394205 +0000 UTC m=+708.805940186" Feb 23 00:19:13 crc kubenswrapper[4735]: I0223 00:19:13.079152 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/elasticsearch-es-default-0" Feb 23 00:19:33 crc kubenswrapper[4735]: I0223 00:19:33.932608 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/infrawatch-operators-smart-gateway-operator-bundle-nightly-head"] Feb 23 00:19:33 crc kubenswrapper[4735]: I0223 00:19:33.935270 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-smart-gateway-operator-bundle-nightly-head" Feb 23 00:19:33 crc kubenswrapper[4735]: I0223 00:19:33.939487 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-catalog-configmap-partition-1" Feb 23 00:19:33 crc kubenswrapper[4735]: I0223 00:19:33.942545 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-smart-gateway-operator-bundle-nightly-head"] Feb 23 00:19:34 crc kubenswrapper[4735]: I0223 00:19:34.073774 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"smart-gateway-operator-catalog-configmap-partition-1-volume\" (UniqueName: \"kubernetes.io/configmap/a20e4468-f688-479b-98e1-df76a5142b19-smart-gateway-operator-catalog-configmap-partition-1-volume\") pod \"infrawatch-operators-smart-gateway-operator-bundle-nightly-head\" (UID: \"a20e4468-f688-479b-98e1-df76a5142b19\") " pod="service-telemetry/infrawatch-operators-smart-gateway-operator-bundle-nightly-head" Feb 23 00:19:34 crc kubenswrapper[4735]: I0223 00:19:34.073841 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"smart-gateway-operator-catalog-configmap-partition-1-unzip\" (UniqueName: \"kubernetes.io/empty-dir/a20e4468-f688-479b-98e1-df76a5142b19-smart-gateway-operator-catalog-configmap-partition-1-unzip\") pod \"infrawatch-operators-smart-gateway-operator-bundle-nightly-head\" (UID: \"a20e4468-f688-479b-98e1-df76a5142b19\") " pod="service-telemetry/infrawatch-operators-smart-gateway-operator-bundle-nightly-head" Feb 23 00:19:34 crc kubenswrapper[4735]: I0223 00:19:34.073897 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq8vm\" (UniqueName: \"kubernetes.io/projected/a20e4468-f688-479b-98e1-df76a5142b19-kube-api-access-zq8vm\") pod 
\"infrawatch-operators-smart-gateway-operator-bundle-nightly-head\" (UID: \"a20e4468-f688-479b-98e1-df76a5142b19\") " pod="service-telemetry/infrawatch-operators-smart-gateway-operator-bundle-nightly-head" Feb 23 00:19:34 crc kubenswrapper[4735]: I0223 00:19:34.175225 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"smart-gateway-operator-catalog-configmap-partition-1-unzip\" (UniqueName: \"kubernetes.io/empty-dir/a20e4468-f688-479b-98e1-df76a5142b19-smart-gateway-operator-catalog-configmap-partition-1-unzip\") pod \"infrawatch-operators-smart-gateway-operator-bundle-nightly-head\" (UID: \"a20e4468-f688-479b-98e1-df76a5142b19\") " pod="service-telemetry/infrawatch-operators-smart-gateway-operator-bundle-nightly-head" Feb 23 00:19:34 crc kubenswrapper[4735]: I0223 00:19:34.175916 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq8vm\" (UniqueName: \"kubernetes.io/projected/a20e4468-f688-479b-98e1-df76a5142b19-kube-api-access-zq8vm\") pod \"infrawatch-operators-smart-gateway-operator-bundle-nightly-head\" (UID: \"a20e4468-f688-479b-98e1-df76a5142b19\") " pod="service-telemetry/infrawatch-operators-smart-gateway-operator-bundle-nightly-head" Feb 23 00:19:34 crc kubenswrapper[4735]: I0223 00:19:34.176093 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"smart-gateway-operator-catalog-configmap-partition-1-unzip\" (UniqueName: \"kubernetes.io/empty-dir/a20e4468-f688-479b-98e1-df76a5142b19-smart-gateway-operator-catalog-configmap-partition-1-unzip\") pod \"infrawatch-operators-smart-gateway-operator-bundle-nightly-head\" (UID: \"a20e4468-f688-479b-98e1-df76a5142b19\") " pod="service-telemetry/infrawatch-operators-smart-gateway-operator-bundle-nightly-head" Feb 23 00:19:34 crc kubenswrapper[4735]: I0223 00:19:34.176127 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"smart-gateway-operator-catalog-configmap-partition-1-volume\" 
(UniqueName: \"kubernetes.io/configmap/a20e4468-f688-479b-98e1-df76a5142b19-smart-gateway-operator-catalog-configmap-partition-1-volume\") pod \"infrawatch-operators-smart-gateway-operator-bundle-nightly-head\" (UID: \"a20e4468-f688-479b-98e1-df76a5142b19\") " pod="service-telemetry/infrawatch-operators-smart-gateway-operator-bundle-nightly-head" Feb 23 00:19:34 crc kubenswrapper[4735]: I0223 00:19:34.177189 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"smart-gateway-operator-catalog-configmap-partition-1-volume\" (UniqueName: \"kubernetes.io/configmap/a20e4468-f688-479b-98e1-df76a5142b19-smart-gateway-operator-catalog-configmap-partition-1-volume\") pod \"infrawatch-operators-smart-gateway-operator-bundle-nightly-head\" (UID: \"a20e4468-f688-479b-98e1-df76a5142b19\") " pod="service-telemetry/infrawatch-operators-smart-gateway-operator-bundle-nightly-head" Feb 23 00:19:34 crc kubenswrapper[4735]: I0223 00:19:34.220116 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq8vm\" (UniqueName: \"kubernetes.io/projected/a20e4468-f688-479b-98e1-df76a5142b19-kube-api-access-zq8vm\") pod \"infrawatch-operators-smart-gateway-operator-bundle-nightly-head\" (UID: \"a20e4468-f688-479b-98e1-df76a5142b19\") " pod="service-telemetry/infrawatch-operators-smart-gateway-operator-bundle-nightly-head" Feb 23 00:19:34 crc kubenswrapper[4735]: I0223 00:19:34.265420 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-smart-gateway-operator-bundle-nightly-head" Feb 23 00:19:34 crc kubenswrapper[4735]: I0223 00:19:34.537451 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-smart-gateway-operator-bundle-nightly-head"] Feb 23 00:19:35 crc kubenswrapper[4735]: I0223 00:19:35.512216 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-smart-gateway-operator-bundle-nightly-head" event={"ID":"a20e4468-f688-479b-98e1-df76a5142b19","Type":"ContainerStarted","Data":"550c082e9a66cacbbc92dc030e671e5ed7bd431828d1474675ee197ef0a12650"} Feb 23 00:19:40 crc kubenswrapper[4735]: I0223 00:19:40.548103 4735 generic.go:334] "Generic (PLEG): container finished" podID="a20e4468-f688-479b-98e1-df76a5142b19" containerID="5221ced87de1a9cd46ccfb5635dbb42f410c22d7f4d63c0793aaca9233a4571a" exitCode=0 Feb 23 00:19:40 crc kubenswrapper[4735]: I0223 00:19:40.548228 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-smart-gateway-operator-bundle-nightly-head" event={"ID":"a20e4468-f688-479b-98e1-df76a5142b19","Type":"ContainerDied","Data":"5221ced87de1a9cd46ccfb5635dbb42f410c22d7f4d63c0793aaca9233a4571a"} Feb 23 00:19:41 crc kubenswrapper[4735]: I0223 00:19:41.512953 4735 patch_prober.go:28] interesting pod/machine-config-daemon-blmnv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 00:19:41 crc kubenswrapper[4735]: I0223 00:19:41.513583 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" podUID="1cba474f-2d55-4a07-969f-25e2817a06d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Feb 23 00:19:43 crc kubenswrapper[4735]: I0223 00:19:43.575822 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-smart-gateway-operator-bundle-nightly-head" event={"ID":"a20e4468-f688-479b-98e1-df76a5142b19","Type":"ContainerStarted","Data":"9031fc783998bb01c611af9c0751e77de2b71b01d1f0f0a6aa2e3057c3ad450d"} Feb 23 00:19:43 crc kubenswrapper[4735]: I0223 00:19:43.599539 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/infrawatch-operators-smart-gateway-operator-bundle-nightly-head" podStartSLOduration=2.19405573 podStartE2EDuration="10.599523574s" podCreationTimestamp="2026-02-23 00:19:33 +0000 UTC" firstStartedPulling="2026-02-23 00:19:34.55004016 +0000 UTC m=+733.013586151" lastFinishedPulling="2026-02-23 00:19:42.955507984 +0000 UTC m=+741.419053995" observedRunningTime="2026-02-23 00:19:43.595448563 +0000 UTC m=+742.058994574" watchObservedRunningTime="2026-02-23 00:19:43.599523574 +0000 UTC m=+742.063069545" Feb 23 00:19:44 crc kubenswrapper[4735]: I0223 00:19:44.593136 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/581064c273eeb770c9fbc3e03ee675cb542f06b12d97607b3aad97666175svb"] Feb 23 00:19:44 crc kubenswrapper[4735]: I0223 00:19:44.594730 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/581064c273eeb770c9fbc3e03ee675cb542f06b12d97607b3aad97666175svb" Feb 23 00:19:44 crc kubenswrapper[4735]: I0223 00:19:44.631253 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/581064c273eeb770c9fbc3e03ee675cb542f06b12d97607b3aad97666175svb"] Feb 23 00:19:44 crc kubenswrapper[4735]: I0223 00:19:44.720088 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4cd9\" (UniqueName: \"kubernetes.io/projected/c2b00646-a6f5-437a-8204-b4fe9297844b-kube-api-access-d4cd9\") pod \"581064c273eeb770c9fbc3e03ee675cb542f06b12d97607b3aad97666175svb\" (UID: \"c2b00646-a6f5-437a-8204-b4fe9297844b\") " pod="service-telemetry/581064c273eeb770c9fbc3e03ee675cb542f06b12d97607b3aad97666175svb" Feb 23 00:19:44 crc kubenswrapper[4735]: I0223 00:19:44.720157 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c2b00646-a6f5-437a-8204-b4fe9297844b-bundle\") pod \"581064c273eeb770c9fbc3e03ee675cb542f06b12d97607b3aad97666175svb\" (UID: \"c2b00646-a6f5-437a-8204-b4fe9297844b\") " pod="service-telemetry/581064c273eeb770c9fbc3e03ee675cb542f06b12d97607b3aad97666175svb" Feb 23 00:19:44 crc kubenswrapper[4735]: I0223 00:19:44.720196 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c2b00646-a6f5-437a-8204-b4fe9297844b-util\") pod \"581064c273eeb770c9fbc3e03ee675cb542f06b12d97607b3aad97666175svb\" (UID: \"c2b00646-a6f5-437a-8204-b4fe9297844b\") " pod="service-telemetry/581064c273eeb770c9fbc3e03ee675cb542f06b12d97607b3aad97666175svb" Feb 23 00:19:44 crc kubenswrapper[4735]: I0223 00:19:44.821799 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c2b00646-a6f5-437a-8204-b4fe9297844b-util\") pod 
\"581064c273eeb770c9fbc3e03ee675cb542f06b12d97607b3aad97666175svb\" (UID: \"c2b00646-a6f5-437a-8204-b4fe9297844b\") " pod="service-telemetry/581064c273eeb770c9fbc3e03ee675cb542f06b12d97607b3aad97666175svb" Feb 23 00:19:44 crc kubenswrapper[4735]: I0223 00:19:44.821934 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4cd9\" (UniqueName: \"kubernetes.io/projected/c2b00646-a6f5-437a-8204-b4fe9297844b-kube-api-access-d4cd9\") pod \"581064c273eeb770c9fbc3e03ee675cb542f06b12d97607b3aad97666175svb\" (UID: \"c2b00646-a6f5-437a-8204-b4fe9297844b\") " pod="service-telemetry/581064c273eeb770c9fbc3e03ee675cb542f06b12d97607b3aad97666175svb" Feb 23 00:19:44 crc kubenswrapper[4735]: I0223 00:19:44.821964 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c2b00646-a6f5-437a-8204-b4fe9297844b-bundle\") pod \"581064c273eeb770c9fbc3e03ee675cb542f06b12d97607b3aad97666175svb\" (UID: \"c2b00646-a6f5-437a-8204-b4fe9297844b\") " pod="service-telemetry/581064c273eeb770c9fbc3e03ee675cb542f06b12d97607b3aad97666175svb" Feb 23 00:19:44 crc kubenswrapper[4735]: I0223 00:19:44.822511 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c2b00646-a6f5-437a-8204-b4fe9297844b-bundle\") pod \"581064c273eeb770c9fbc3e03ee675cb542f06b12d97607b3aad97666175svb\" (UID: \"c2b00646-a6f5-437a-8204-b4fe9297844b\") " pod="service-telemetry/581064c273eeb770c9fbc3e03ee675cb542f06b12d97607b3aad97666175svb" Feb 23 00:19:44 crc kubenswrapper[4735]: I0223 00:19:44.822654 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c2b00646-a6f5-437a-8204-b4fe9297844b-util\") pod \"581064c273eeb770c9fbc3e03ee675cb542f06b12d97607b3aad97666175svb\" (UID: \"c2b00646-a6f5-437a-8204-b4fe9297844b\") " 
pod="service-telemetry/581064c273eeb770c9fbc3e03ee675cb542f06b12d97607b3aad97666175svb" Feb 23 00:19:44 crc kubenswrapper[4735]: I0223 00:19:44.842769 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4cd9\" (UniqueName: \"kubernetes.io/projected/c2b00646-a6f5-437a-8204-b4fe9297844b-kube-api-access-d4cd9\") pod \"581064c273eeb770c9fbc3e03ee675cb542f06b12d97607b3aad97666175svb\" (UID: \"c2b00646-a6f5-437a-8204-b4fe9297844b\") " pod="service-telemetry/581064c273eeb770c9fbc3e03ee675cb542f06b12d97607b3aad97666175svb" Feb 23 00:19:44 crc kubenswrapper[4735]: I0223 00:19:44.922092 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/581064c273eeb770c9fbc3e03ee675cb542f06b12d97607b3aad97666175svb" Feb 23 00:19:45 crc kubenswrapper[4735]: I0223 00:19:45.336713 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/581064c273eeb770c9fbc3e03ee675cb542f06b12d97607b3aad97666175svb"] Feb 23 00:19:45 crc kubenswrapper[4735]: W0223 00:19:45.345624 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2b00646_a6f5_437a_8204_b4fe9297844b.slice/crio-79835ac508ad80ce981f931e050b004bb3dc53e0a29db89d9c05ea65e1ed1d92 WatchSource:0}: Error finding container 79835ac508ad80ce981f931e050b004bb3dc53e0a29db89d9c05ea65e1ed1d92: Status 404 returned error can't find the container with id 79835ac508ad80ce981f931e050b004bb3dc53e0a29db89d9c05ea65e1ed1d92 Feb 23 00:19:45 crc kubenswrapper[4735]: I0223 00:19:45.599118 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/581064c273eeb770c9fbc3e03ee675cb542f06b12d97607b3aad97666175svb" event={"ID":"c2b00646-a6f5-437a-8204-b4fe9297844b","Type":"ContainerStarted","Data":"74358b6aedebad2d97d3cd5b4769eefc92c04356ff1a7f024e5ad7d87be1ed07"} Feb 23 00:19:45 crc kubenswrapper[4735]: I0223 00:19:45.599639 4735 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="service-telemetry/581064c273eeb770c9fbc3e03ee675cb542f06b12d97607b3aad97666175svb" event={"ID":"c2b00646-a6f5-437a-8204-b4fe9297844b","Type":"ContainerStarted","Data":"79835ac508ad80ce981f931e050b004bb3dc53e0a29db89d9c05ea65e1ed1d92"} Feb 23 00:19:46 crc kubenswrapper[4735]: I0223 00:19:46.610915 4735 generic.go:334] "Generic (PLEG): container finished" podID="c2b00646-a6f5-437a-8204-b4fe9297844b" containerID="74358b6aedebad2d97d3cd5b4769eefc92c04356ff1a7f024e5ad7d87be1ed07" exitCode=0 Feb 23 00:19:46 crc kubenswrapper[4735]: I0223 00:19:46.611077 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/581064c273eeb770c9fbc3e03ee675cb542f06b12d97607b3aad97666175svb" event={"ID":"c2b00646-a6f5-437a-8204-b4fe9297844b","Type":"ContainerDied","Data":"74358b6aedebad2d97d3cd5b4769eefc92c04356ff1a7f024e5ad7d87be1ed07"} Feb 23 00:19:48 crc kubenswrapper[4735]: I0223 00:19:48.630972 4735 generic.go:334] "Generic (PLEG): container finished" podID="c2b00646-a6f5-437a-8204-b4fe9297844b" containerID="fd978addee978ce9e729dd474cba5c68ce0036dd60b1a079c2248ecb0a03c826" exitCode=0 Feb 23 00:19:48 crc kubenswrapper[4735]: I0223 00:19:48.631025 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/581064c273eeb770c9fbc3e03ee675cb542f06b12d97607b3aad97666175svb" event={"ID":"c2b00646-a6f5-437a-8204-b4fe9297844b","Type":"ContainerDied","Data":"fd978addee978ce9e729dd474cba5c68ce0036dd60b1a079c2248ecb0a03c826"} Feb 23 00:19:49 crc kubenswrapper[4735]: I0223 00:19:49.643216 4735 generic.go:334] "Generic (PLEG): container finished" podID="c2b00646-a6f5-437a-8204-b4fe9297844b" containerID="ac80e40c34809ce478b8128d2c807e09764a8184c9a3c17246192112f5784055" exitCode=0 Feb 23 00:19:49 crc kubenswrapper[4735]: I0223 00:19:49.643293 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/581064c273eeb770c9fbc3e03ee675cb542f06b12d97607b3aad97666175svb" 
event={"ID":"c2b00646-a6f5-437a-8204-b4fe9297844b","Type":"ContainerDied","Data":"ac80e40c34809ce478b8128d2c807e09764a8184c9a3c17246192112f5784055"} Feb 23 00:19:50 crc kubenswrapper[4735]: I0223 00:19:50.988022 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/581064c273eeb770c9fbc3e03ee675cb542f06b12d97607b3aad97666175svb" Feb 23 00:19:51 crc kubenswrapper[4735]: I0223 00:19:51.112410 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c2b00646-a6f5-437a-8204-b4fe9297844b-bundle\") pod \"c2b00646-a6f5-437a-8204-b4fe9297844b\" (UID: \"c2b00646-a6f5-437a-8204-b4fe9297844b\") " Feb 23 00:19:51 crc kubenswrapper[4735]: I0223 00:19:51.112490 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c2b00646-a6f5-437a-8204-b4fe9297844b-util\") pod \"c2b00646-a6f5-437a-8204-b4fe9297844b\" (UID: \"c2b00646-a6f5-437a-8204-b4fe9297844b\") " Feb 23 00:19:51 crc kubenswrapper[4735]: I0223 00:19:51.112540 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4cd9\" (UniqueName: \"kubernetes.io/projected/c2b00646-a6f5-437a-8204-b4fe9297844b-kube-api-access-d4cd9\") pod \"c2b00646-a6f5-437a-8204-b4fe9297844b\" (UID: \"c2b00646-a6f5-437a-8204-b4fe9297844b\") " Feb 23 00:19:51 crc kubenswrapper[4735]: I0223 00:19:51.113499 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2b00646-a6f5-437a-8204-b4fe9297844b-bundle" (OuterVolumeSpecName: "bundle") pod "c2b00646-a6f5-437a-8204-b4fe9297844b" (UID: "c2b00646-a6f5-437a-8204-b4fe9297844b"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 00:19:51 crc kubenswrapper[4735]: I0223 00:19:51.120111 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2b00646-a6f5-437a-8204-b4fe9297844b-kube-api-access-d4cd9" (OuterVolumeSpecName: "kube-api-access-d4cd9") pod "c2b00646-a6f5-437a-8204-b4fe9297844b" (UID: "c2b00646-a6f5-437a-8204-b4fe9297844b"). InnerVolumeSpecName "kube-api-access-d4cd9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:19:51 crc kubenswrapper[4735]: I0223 00:19:51.214275 4735 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c2b00646-a6f5-437a-8204-b4fe9297844b-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 00:19:51 crc kubenswrapper[4735]: I0223 00:19:51.214616 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4cd9\" (UniqueName: \"kubernetes.io/projected/c2b00646-a6f5-437a-8204-b4fe9297844b-kube-api-access-d4cd9\") on node \"crc\" DevicePath \"\"" Feb 23 00:19:51 crc kubenswrapper[4735]: I0223 00:19:51.250976 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2b00646-a6f5-437a-8204-b4fe9297844b-util" (OuterVolumeSpecName: "util") pod "c2b00646-a6f5-437a-8204-b4fe9297844b" (UID: "c2b00646-a6f5-437a-8204-b4fe9297844b"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 00:19:51 crc kubenswrapper[4735]: I0223 00:19:51.316384 4735 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c2b00646-a6f5-437a-8204-b4fe9297844b-util\") on node \"crc\" DevicePath \"\"" Feb 23 00:19:51 crc kubenswrapper[4735]: I0223 00:19:51.657743 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/581064c273eeb770c9fbc3e03ee675cb542f06b12d97607b3aad97666175svb" event={"ID":"c2b00646-a6f5-437a-8204-b4fe9297844b","Type":"ContainerDied","Data":"79835ac508ad80ce981f931e050b004bb3dc53e0a29db89d9c05ea65e1ed1d92"} Feb 23 00:19:51 crc kubenswrapper[4735]: I0223 00:19:51.657788 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79835ac508ad80ce981f931e050b004bb3dc53e0a29db89d9c05ea65e1ed1d92" Feb 23 00:19:51 crc kubenswrapper[4735]: I0223 00:19:51.657883 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/581064c273eeb770c9fbc3e03ee675cb542f06b12d97607b3aad97666175svb" Feb 23 00:19:55 crc kubenswrapper[4735]: I0223 00:19:55.407255 4735 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 23 00:19:56 crc kubenswrapper[4735]: I0223 00:19:56.552234 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-bbbc889bc-wbwd6"] Feb 23 00:19:56 crc kubenswrapper[4735]: E0223 00:19:56.552787 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2b00646-a6f5-437a-8204-b4fe9297844b" containerName="util" Feb 23 00:19:56 crc kubenswrapper[4735]: I0223 00:19:56.552807 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2b00646-a6f5-437a-8204-b4fe9297844b" containerName="util" Feb 23 00:19:56 crc kubenswrapper[4735]: E0223 00:19:56.552822 4735 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c2b00646-a6f5-437a-8204-b4fe9297844b" containerName="pull" Feb 23 00:19:56 crc kubenswrapper[4735]: I0223 00:19:56.552833 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2b00646-a6f5-437a-8204-b4fe9297844b" containerName="pull" Feb 23 00:19:56 crc kubenswrapper[4735]: E0223 00:19:56.552846 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2b00646-a6f5-437a-8204-b4fe9297844b" containerName="extract" Feb 23 00:19:56 crc kubenswrapper[4735]: I0223 00:19:56.552894 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2b00646-a6f5-437a-8204-b4fe9297844b" containerName="extract" Feb 23 00:19:56 crc kubenswrapper[4735]: I0223 00:19:56.553093 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2b00646-a6f5-437a-8204-b4fe9297844b" containerName="extract" Feb 23 00:19:56 crc kubenswrapper[4735]: I0223 00:19:56.553727 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bbbc889bc-wbwd6" Feb 23 00:19:56 crc kubenswrapper[4735]: I0223 00:19:56.555924 4735 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-operator-dockercfg-6s4ds" Feb 23 00:19:56 crc kubenswrapper[4735]: I0223 00:19:56.566947 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bbbc889bc-wbwd6"] Feb 23 00:19:56 crc kubenswrapper[4735]: I0223 00:19:56.694530 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8lx2\" (UniqueName: \"kubernetes.io/projected/b1475d02-2bda-43e0-adc0-064144972174-kube-api-access-c8lx2\") pod \"smart-gateway-operator-bbbc889bc-wbwd6\" (UID: \"b1475d02-2bda-43e0-adc0-064144972174\") " pod="service-telemetry/smart-gateway-operator-bbbc889bc-wbwd6" Feb 23 00:19:56 crc kubenswrapper[4735]: I0223 00:19:56.694750 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/b1475d02-2bda-43e0-adc0-064144972174-runner\") pod \"smart-gateway-operator-bbbc889bc-wbwd6\" (UID: \"b1475d02-2bda-43e0-adc0-064144972174\") " pod="service-telemetry/smart-gateway-operator-bbbc889bc-wbwd6" Feb 23 00:19:56 crc kubenswrapper[4735]: I0223 00:19:56.796144 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/b1475d02-2bda-43e0-adc0-064144972174-runner\") pod \"smart-gateway-operator-bbbc889bc-wbwd6\" (UID: \"b1475d02-2bda-43e0-adc0-064144972174\") " pod="service-telemetry/smart-gateway-operator-bbbc889bc-wbwd6" Feb 23 00:19:56 crc kubenswrapper[4735]: I0223 00:19:56.796259 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8lx2\" (UniqueName: \"kubernetes.io/projected/b1475d02-2bda-43e0-adc0-064144972174-kube-api-access-c8lx2\") pod \"smart-gateway-operator-bbbc889bc-wbwd6\" (UID: \"b1475d02-2bda-43e0-adc0-064144972174\") " pod="service-telemetry/smart-gateway-operator-bbbc889bc-wbwd6" Feb 23 00:19:56 crc kubenswrapper[4735]: I0223 00:19:56.796691 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/b1475d02-2bda-43e0-adc0-064144972174-runner\") pod \"smart-gateway-operator-bbbc889bc-wbwd6\" (UID: \"b1475d02-2bda-43e0-adc0-064144972174\") " pod="service-telemetry/smart-gateway-operator-bbbc889bc-wbwd6" Feb 23 00:19:56 crc kubenswrapper[4735]: I0223 00:19:56.824544 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8lx2\" (UniqueName: \"kubernetes.io/projected/b1475d02-2bda-43e0-adc0-064144972174-kube-api-access-c8lx2\") pod \"smart-gateway-operator-bbbc889bc-wbwd6\" (UID: \"b1475d02-2bda-43e0-adc0-064144972174\") " pod="service-telemetry/smart-gateway-operator-bbbc889bc-wbwd6" Feb 23 00:19:56 crc kubenswrapper[4735]: I0223 00:19:56.874879 4735 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bbbc889bc-wbwd6" Feb 23 00:19:57 crc kubenswrapper[4735]: I0223 00:19:57.065220 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bbbc889bc-wbwd6"] Feb 23 00:19:57 crc kubenswrapper[4735]: I0223 00:19:57.700667 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bbbc889bc-wbwd6" event={"ID":"b1475d02-2bda-43e0-adc0-064144972174","Type":"ContainerStarted","Data":"8e397332a991658dc45efc92756c1d9e28557eaf4664d4325d40d578e1523477"} Feb 23 00:20:10 crc kubenswrapper[4735]: E0223 00:20:10.331990 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/infrawatch/smart-gateway-operator:latest" Feb 23 00:20:10 crc kubenswrapper[4735]: E0223 00:20:10.332947 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/infrawatch/smart-gateway-operator:latest,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:WATCH_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.annotations['olm.targetNamespaces'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:OPERATOR_NAME,Value:smart-gateway-operator,ValueFrom:nil,},EnvVar{Name:ANSIBLE_GATHERING,Value:explicit,ValueFrom:nil,},EnvVar{Name:ANSIBLE_VERBOSITY_SMARTGATEWAY_SMARTGATEWAY_INFRA_WATCH,Value:4,ValueFrom:nil,},EnvVar{Name:ANSIBLE_DEBUG_LOGS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CORE_SMARTGATEWAY_IMAGE,Value:quay.io/infrawatch/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BRIDGE_SMARTGATEWAY_IMAGE,Value:quay.io/infrawatch/sg-bridge:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OAUTH_PROXY_IMAGE,Value:quay.io/openshift/origin-oauth-proxy:latest,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:smart-gateway-operator.v5.0.1768085178,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:runner,ReadOnly:false,MountPath:/tmp/ansible-operator/runner,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c8lx2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRoot
Filesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod smart-gateway-operator-bbbc889bc-wbwd6_service-telemetry(b1475d02-2bda-43e0-adc0-064144972174): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 23 00:20:10 crc kubenswrapper[4735]: E0223 00:20:10.334158 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/smart-gateway-operator-bbbc889bc-wbwd6" podUID="b1475d02-2bda-43e0-adc0-064144972174" Feb 23 00:20:10 crc kubenswrapper[4735]: E0223 00:20:10.789843 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/infrawatch/smart-gateway-operator:latest\\\"\"" pod="service-telemetry/smart-gateway-operator-bbbc889bc-wbwd6" podUID="b1475d02-2bda-43e0-adc0-064144972174" Feb 23 00:20:11 crc kubenswrapper[4735]: I0223 00:20:11.512528 4735 patch_prober.go:28] interesting pod/machine-config-daemon-blmnv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 00:20:11 crc kubenswrapper[4735]: I0223 00:20:11.512590 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" podUID="1cba474f-2d55-4a07-969f-25e2817a06d0" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 00:20:26 crc kubenswrapper[4735]: I0223 00:20:26.939442 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bbbc889bc-wbwd6" event={"ID":"b1475d02-2bda-43e0-adc0-064144972174","Type":"ContainerStarted","Data":"bb211130f88cb71238208501c2d78e3677633491e18a0b758b8b1e545b08acab"} Feb 23 00:20:26 crc kubenswrapper[4735]: I0223 00:20:26.968128 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-bbbc889bc-wbwd6" podStartSLOduration=2.27163101 podStartE2EDuration="30.968104572s" podCreationTimestamp="2026-02-23 00:19:56 +0000 UTC" firstStartedPulling="2026-02-23 00:19:57.077995979 +0000 UTC m=+755.541541950" lastFinishedPulling="2026-02-23 00:20:25.774469531 +0000 UTC m=+784.238015512" observedRunningTime="2026-02-23 00:20:26.961828972 +0000 UTC m=+785.425374953" watchObservedRunningTime="2026-02-23 00:20:26.968104572 +0000 UTC m=+785.431650563" Feb 23 00:20:41 crc kubenswrapper[4735]: I0223 00:20:41.512654 4735 patch_prober.go:28] interesting pod/machine-config-daemon-blmnv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 00:20:41 crc kubenswrapper[4735]: I0223 00:20:41.513361 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" podUID="1cba474f-2d55-4a07-969f-25e2817a06d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 00:20:41 crc kubenswrapper[4735]: I0223 00:20:41.513426 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-blmnv" Feb 23 00:20:41 crc kubenswrapper[4735]: I0223 00:20:41.514344 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6d18ff42046d8089570f691ef425e2b8cbc857ffa40454ed7ce709bd6b34ea17"} pod="openshift-machine-config-operator/machine-config-daemon-blmnv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 00:20:41 crc kubenswrapper[4735]: I0223 00:20:41.514448 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" podUID="1cba474f-2d55-4a07-969f-25e2817a06d0" containerName="machine-config-daemon" containerID="cri-o://6d18ff42046d8089570f691ef425e2b8cbc857ffa40454ed7ce709bd6b34ea17" gracePeriod=600 Feb 23 00:20:42 crc kubenswrapper[4735]: I0223 00:20:42.055122 4735 generic.go:334] "Generic (PLEG): container finished" podID="1cba474f-2d55-4a07-969f-25e2817a06d0" containerID="6d18ff42046d8089570f691ef425e2b8cbc857ffa40454ed7ce709bd6b34ea17" exitCode=0 Feb 23 00:20:42 crc kubenswrapper[4735]: I0223 00:20:42.055184 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" event={"ID":"1cba474f-2d55-4a07-969f-25e2817a06d0","Type":"ContainerDied","Data":"6d18ff42046d8089570f691ef425e2b8cbc857ffa40454ed7ce709bd6b34ea17"} Feb 23 00:20:42 crc kubenswrapper[4735]: I0223 00:20:42.055229 4735 scope.go:117] "RemoveContainer" containerID="9ffff5ef8ce3a35f166e6769e8ce86e4bf9ce64b374895f07abf527d84d7182c" Feb 23 00:20:43 crc kubenswrapper[4735]: I0223 00:20:43.068528 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" 
event={"ID":"1cba474f-2d55-4a07-969f-25e2817a06d0","Type":"ContainerStarted","Data":"a99b454aab2d4f811a91442480520f4ceedbb717995dcf3a4814eb9e1442c818"} Feb 23 00:20:50 crc kubenswrapper[4735]: I0223 00:20:50.983323 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/awatch-operators-service-telemetry-operator-bundle-nightly-head"] Feb 23 00:20:50 crc kubenswrapper[4735]: I0223 00:20:50.985959 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/awatch-operators-service-telemetry-operator-bundle-nightly-head" Feb 23 00:20:50 crc kubenswrapper[4735]: I0223 00:20:50.990998 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-catalog-configmap-partition-1" Feb 23 00:20:50 crc kubenswrapper[4735]: I0223 00:20:50.996747 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/awatch-operators-service-telemetry-operator-bundle-nightly-head"] Feb 23 00:20:51 crc kubenswrapper[4735]: I0223 00:20:51.009191 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-telemetry-operator-catalog-configmap-partition-1-volume\" (UniqueName: \"kubernetes.io/configmap/266499b6-c905-4095-8f72-afbb761a4ec8-service-telemetry-operator-catalog-configmap-partition-1-volume\") pod \"awatch-operators-service-telemetry-operator-bundle-nightly-head\" (UID: \"266499b6-c905-4095-8f72-afbb761a4ec8\") " pod="service-telemetry/awatch-operators-service-telemetry-operator-bundle-nightly-head" Feb 23 00:20:51 crc kubenswrapper[4735]: I0223 00:20:51.009564 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-telemetry-operator-catalog-configmap-partition-1-unzip\" (UniqueName: \"kubernetes.io/empty-dir/266499b6-c905-4095-8f72-afbb761a4ec8-service-telemetry-operator-catalog-configmap-partition-1-unzip\") pod 
\"awatch-operators-service-telemetry-operator-bundle-nightly-head\" (UID: \"266499b6-c905-4095-8f72-afbb761a4ec8\") " pod="service-telemetry/awatch-operators-service-telemetry-operator-bundle-nightly-head" Feb 23 00:20:51 crc kubenswrapper[4735]: I0223 00:20:51.009982 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45qrx\" (UniqueName: \"kubernetes.io/projected/266499b6-c905-4095-8f72-afbb761a4ec8-kube-api-access-45qrx\") pod \"awatch-operators-service-telemetry-operator-bundle-nightly-head\" (UID: \"266499b6-c905-4095-8f72-afbb761a4ec8\") " pod="service-telemetry/awatch-operators-service-telemetry-operator-bundle-nightly-head" Feb 23 00:20:51 crc kubenswrapper[4735]: I0223 00:20:51.111881 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45qrx\" (UniqueName: \"kubernetes.io/projected/266499b6-c905-4095-8f72-afbb761a4ec8-kube-api-access-45qrx\") pod \"awatch-operators-service-telemetry-operator-bundle-nightly-head\" (UID: \"266499b6-c905-4095-8f72-afbb761a4ec8\") " pod="service-telemetry/awatch-operators-service-telemetry-operator-bundle-nightly-head" Feb 23 00:20:51 crc kubenswrapper[4735]: I0223 00:20:51.111972 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-telemetry-operator-catalog-configmap-partition-1-volume\" (UniqueName: \"kubernetes.io/configmap/266499b6-c905-4095-8f72-afbb761a4ec8-service-telemetry-operator-catalog-configmap-partition-1-volume\") pod \"awatch-operators-service-telemetry-operator-bundle-nightly-head\" (UID: \"266499b6-c905-4095-8f72-afbb761a4ec8\") " pod="service-telemetry/awatch-operators-service-telemetry-operator-bundle-nightly-head" Feb 23 00:20:51 crc kubenswrapper[4735]: I0223 00:20:51.112033 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-telemetry-operator-catalog-configmap-partition-1-unzip\" (UniqueName: 
\"kubernetes.io/empty-dir/266499b6-c905-4095-8f72-afbb761a4ec8-service-telemetry-operator-catalog-configmap-partition-1-unzip\") pod \"awatch-operators-service-telemetry-operator-bundle-nightly-head\" (UID: \"266499b6-c905-4095-8f72-afbb761a4ec8\") " pod="service-telemetry/awatch-operators-service-telemetry-operator-bundle-nightly-head" Feb 23 00:20:51 crc kubenswrapper[4735]: I0223 00:20:51.112894 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-telemetry-operator-catalog-configmap-partition-1-unzip\" (UniqueName: \"kubernetes.io/empty-dir/266499b6-c905-4095-8f72-afbb761a4ec8-service-telemetry-operator-catalog-configmap-partition-1-unzip\") pod \"awatch-operators-service-telemetry-operator-bundle-nightly-head\" (UID: \"266499b6-c905-4095-8f72-afbb761a4ec8\") " pod="service-telemetry/awatch-operators-service-telemetry-operator-bundle-nightly-head" Feb 23 00:20:51 crc kubenswrapper[4735]: I0223 00:20:51.113961 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-telemetry-operator-catalog-configmap-partition-1-volume\" (UniqueName: \"kubernetes.io/configmap/266499b6-c905-4095-8f72-afbb761a4ec8-service-telemetry-operator-catalog-configmap-partition-1-volume\") pod \"awatch-operators-service-telemetry-operator-bundle-nightly-head\" (UID: \"266499b6-c905-4095-8f72-afbb761a4ec8\") " pod="service-telemetry/awatch-operators-service-telemetry-operator-bundle-nightly-head" Feb 23 00:20:51 crc kubenswrapper[4735]: I0223 00:20:51.147307 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45qrx\" (UniqueName: \"kubernetes.io/projected/266499b6-c905-4095-8f72-afbb761a4ec8-kube-api-access-45qrx\") pod \"awatch-operators-service-telemetry-operator-bundle-nightly-head\" (UID: \"266499b6-c905-4095-8f72-afbb761a4ec8\") " pod="service-telemetry/awatch-operators-service-telemetry-operator-bundle-nightly-head" Feb 23 00:20:51 crc kubenswrapper[4735]: I0223 00:20:51.318352 4735 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/awatch-operators-service-telemetry-operator-bundle-nightly-head" Feb 23 00:20:51 crc kubenswrapper[4735]: I0223 00:20:51.633281 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/awatch-operators-service-telemetry-operator-bundle-nightly-head"] Feb 23 00:20:52 crc kubenswrapper[4735]: I0223 00:20:52.148237 4735 generic.go:334] "Generic (PLEG): container finished" podID="266499b6-c905-4095-8f72-afbb761a4ec8" containerID="a871dae5fc6d9c01e23313bc8b62d8f5bb145505590c1e31e5318cffc1a7b504" exitCode=0 Feb 23 00:20:52 crc kubenswrapper[4735]: I0223 00:20:52.148502 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/awatch-operators-service-telemetry-operator-bundle-nightly-head" event={"ID":"266499b6-c905-4095-8f72-afbb761a4ec8","Type":"ContainerDied","Data":"a871dae5fc6d9c01e23313bc8b62d8f5bb145505590c1e31e5318cffc1a7b504"} Feb 23 00:20:52 crc kubenswrapper[4735]: I0223 00:20:52.148759 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/awatch-operators-service-telemetry-operator-bundle-nightly-head" event={"ID":"266499b6-c905-4095-8f72-afbb761a4ec8","Type":"ContainerStarted","Data":"99c6dbe734ec41e89ebee0142ac9d8eaa58fd834b7c2f2c34f7d0f697fca08d0"} Feb 23 00:20:53 crc kubenswrapper[4735]: I0223 00:20:53.159413 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/awatch-operators-service-telemetry-operator-bundle-nightly-head" event={"ID":"266499b6-c905-4095-8f72-afbb761a4ec8","Type":"ContainerStarted","Data":"43519be736e1b584f06a28777264e122cd12348e214063e9f7b637885919cbdd"} Feb 23 00:20:53 crc kubenswrapper[4735]: I0223 00:20:53.190744 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/awatch-operators-service-telemetry-operator-bundle-nightly-head" podStartSLOduration=2.717665646 podStartE2EDuration="3.190720062s" 
podCreationTimestamp="2026-02-23 00:20:50 +0000 UTC" firstStartedPulling="2026-02-23 00:20:52.151127471 +0000 UTC m=+810.614673472" lastFinishedPulling="2026-02-23 00:20:52.624181887 +0000 UTC m=+811.087727888" observedRunningTime="2026-02-23 00:20:53.184905742 +0000 UTC m=+811.648451783" watchObservedRunningTime="2026-02-23 00:20:53.190720062 +0000 UTC m=+811.654266063" Feb 23 00:20:54 crc kubenswrapper[4735]: I0223 00:20:54.011637 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/59d91eeadfbc177692af3c8c1571c9d473bd01e833d0373cf802b3d572nmwh6"] Feb 23 00:20:54 crc kubenswrapper[4735]: I0223 00:20:54.013338 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/59d91eeadfbc177692af3c8c1571c9d473bd01e833d0373cf802b3d572nmwh6" Feb 23 00:20:54 crc kubenswrapper[4735]: I0223 00:20:54.031273 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/59d91eeadfbc177692af3c8c1571c9d473bd01e833d0373cf802b3d572nmwh6"] Feb 23 00:20:54 crc kubenswrapper[4735]: I0223 00:20:54.054981 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7e5eb56b-e719-4b9c-8e3b-47910c2d7dab-util\") pod \"59d91eeadfbc177692af3c8c1571c9d473bd01e833d0373cf802b3d572nmwh6\" (UID: \"7e5eb56b-e719-4b9c-8e3b-47910c2d7dab\") " pod="service-telemetry/59d91eeadfbc177692af3c8c1571c9d473bd01e833d0373cf802b3d572nmwh6" Feb 23 00:20:54 crc kubenswrapper[4735]: I0223 00:20:54.055074 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvks9\" (UniqueName: \"kubernetes.io/projected/7e5eb56b-e719-4b9c-8e3b-47910c2d7dab-kube-api-access-gvks9\") pod \"59d91eeadfbc177692af3c8c1571c9d473bd01e833d0373cf802b3d572nmwh6\" (UID: \"7e5eb56b-e719-4b9c-8e3b-47910c2d7dab\") " pod="service-telemetry/59d91eeadfbc177692af3c8c1571c9d473bd01e833d0373cf802b3d572nmwh6" Feb 23 00:20:54 crc 
kubenswrapper[4735]: I0223 00:20:54.055122 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e5eb56b-e719-4b9c-8e3b-47910c2d7dab-bundle\") pod \"59d91eeadfbc177692af3c8c1571c9d473bd01e833d0373cf802b3d572nmwh6\" (UID: \"7e5eb56b-e719-4b9c-8e3b-47910c2d7dab\") " pod="service-telemetry/59d91eeadfbc177692af3c8c1571c9d473bd01e833d0373cf802b3d572nmwh6" Feb 23 00:20:54 crc kubenswrapper[4735]: I0223 00:20:54.156714 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7e5eb56b-e719-4b9c-8e3b-47910c2d7dab-util\") pod \"59d91eeadfbc177692af3c8c1571c9d473bd01e833d0373cf802b3d572nmwh6\" (UID: \"7e5eb56b-e719-4b9c-8e3b-47910c2d7dab\") " pod="service-telemetry/59d91eeadfbc177692af3c8c1571c9d473bd01e833d0373cf802b3d572nmwh6" Feb 23 00:20:54 crc kubenswrapper[4735]: I0223 00:20:54.156825 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvks9\" (UniqueName: \"kubernetes.io/projected/7e5eb56b-e719-4b9c-8e3b-47910c2d7dab-kube-api-access-gvks9\") pod \"59d91eeadfbc177692af3c8c1571c9d473bd01e833d0373cf802b3d572nmwh6\" (UID: \"7e5eb56b-e719-4b9c-8e3b-47910c2d7dab\") " pod="service-telemetry/59d91eeadfbc177692af3c8c1571c9d473bd01e833d0373cf802b3d572nmwh6" Feb 23 00:20:54 crc kubenswrapper[4735]: I0223 00:20:54.156923 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e5eb56b-e719-4b9c-8e3b-47910c2d7dab-bundle\") pod \"59d91eeadfbc177692af3c8c1571c9d473bd01e833d0373cf802b3d572nmwh6\" (UID: \"7e5eb56b-e719-4b9c-8e3b-47910c2d7dab\") " pod="service-telemetry/59d91eeadfbc177692af3c8c1571c9d473bd01e833d0373cf802b3d572nmwh6" Feb 23 00:20:54 crc kubenswrapper[4735]: I0223 00:20:54.157597 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/7e5eb56b-e719-4b9c-8e3b-47910c2d7dab-util\") pod \"59d91eeadfbc177692af3c8c1571c9d473bd01e833d0373cf802b3d572nmwh6\" (UID: \"7e5eb56b-e719-4b9c-8e3b-47910c2d7dab\") " pod="service-telemetry/59d91eeadfbc177692af3c8c1571c9d473bd01e833d0373cf802b3d572nmwh6" Feb 23 00:20:54 crc kubenswrapper[4735]: I0223 00:20:54.157675 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e5eb56b-e719-4b9c-8e3b-47910c2d7dab-bundle\") pod \"59d91eeadfbc177692af3c8c1571c9d473bd01e833d0373cf802b3d572nmwh6\" (UID: \"7e5eb56b-e719-4b9c-8e3b-47910c2d7dab\") " pod="service-telemetry/59d91eeadfbc177692af3c8c1571c9d473bd01e833d0373cf802b3d572nmwh6" Feb 23 00:20:54 crc kubenswrapper[4735]: I0223 00:20:54.187212 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvks9\" (UniqueName: \"kubernetes.io/projected/7e5eb56b-e719-4b9c-8e3b-47910c2d7dab-kube-api-access-gvks9\") pod \"59d91eeadfbc177692af3c8c1571c9d473bd01e833d0373cf802b3d572nmwh6\" (UID: \"7e5eb56b-e719-4b9c-8e3b-47910c2d7dab\") " pod="service-telemetry/59d91eeadfbc177692af3c8c1571c9d473bd01e833d0373cf802b3d572nmwh6" Feb 23 00:20:54 crc kubenswrapper[4735]: I0223 00:20:54.339829 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/59d91eeadfbc177692af3c8c1571c9d473bd01e833d0373cf802b3d572nmwh6" Feb 23 00:20:54 crc kubenswrapper[4735]: I0223 00:20:54.458020 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8ftcbct"] Feb 23 00:20:54 crc kubenswrapper[4735]: I0223 00:20:54.459765 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8ftcbct" Feb 23 00:20:54 crc kubenswrapper[4735]: I0223 00:20:54.464209 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8ftcbct"] Feb 23 00:20:54 crc kubenswrapper[4735]: I0223 00:20:54.466215 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 23 00:20:54 crc kubenswrapper[4735]: I0223 00:20:54.562962 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f6b98c30-8554-4610-a645-727013065876-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8ftcbct\" (UID: \"f6b98c30-8554-4610-a645-727013065876\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8ftcbct" Feb 23 00:20:54 crc kubenswrapper[4735]: I0223 00:20:54.563000 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f6b98c30-8554-4610-a645-727013065876-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8ftcbct\" (UID: \"f6b98c30-8554-4610-a645-727013065876\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8ftcbct" Feb 23 00:20:54 crc kubenswrapper[4735]: I0223 00:20:54.563248 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz56l\" (UniqueName: \"kubernetes.io/projected/f6b98c30-8554-4610-a645-727013065876-kube-api-access-bz56l\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8ftcbct\" (UID: \"f6b98c30-8554-4610-a645-727013065876\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8ftcbct" Feb 23 00:20:54 crc kubenswrapper[4735]: 
I0223 00:20:54.604086 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/59d91eeadfbc177692af3c8c1571c9d473bd01e833d0373cf802b3d572nmwh6"] Feb 23 00:20:54 crc kubenswrapper[4735]: W0223 00:20:54.613366 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e5eb56b_e719_4b9c_8e3b_47910c2d7dab.slice/crio-82b907d3f38ae2cc8f5a2b5adfe348b93061b06255737a59e822ea5a9781c43c WatchSource:0}: Error finding container 82b907d3f38ae2cc8f5a2b5adfe348b93061b06255737a59e822ea5a9781c43c: Status 404 returned error can't find the container with id 82b907d3f38ae2cc8f5a2b5adfe348b93061b06255737a59e822ea5a9781c43c Feb 23 00:20:54 crc kubenswrapper[4735]: I0223 00:20:54.665123 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f6b98c30-8554-4610-a645-727013065876-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8ftcbct\" (UID: \"f6b98c30-8554-4610-a645-727013065876\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8ftcbct" Feb 23 00:20:54 crc kubenswrapper[4735]: I0223 00:20:54.665196 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f6b98c30-8554-4610-a645-727013065876-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8ftcbct\" (UID: \"f6b98c30-8554-4610-a645-727013065876\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8ftcbct" Feb 23 00:20:54 crc kubenswrapper[4735]: I0223 00:20:54.665382 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz56l\" (UniqueName: \"kubernetes.io/projected/f6b98c30-8554-4610-a645-727013065876-kube-api-access-bz56l\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8ftcbct\" (UID: 
\"f6b98c30-8554-4610-a645-727013065876\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8ftcbct" Feb 23 00:20:54 crc kubenswrapper[4735]: I0223 00:20:54.665653 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f6b98c30-8554-4610-a645-727013065876-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8ftcbct\" (UID: \"f6b98c30-8554-4610-a645-727013065876\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8ftcbct" Feb 23 00:20:54 crc kubenswrapper[4735]: I0223 00:20:54.665724 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f6b98c30-8554-4610-a645-727013065876-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8ftcbct\" (UID: \"f6b98c30-8554-4610-a645-727013065876\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8ftcbct" Feb 23 00:20:54 crc kubenswrapper[4735]: I0223 00:20:54.687060 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz56l\" (UniqueName: \"kubernetes.io/projected/f6b98c30-8554-4610-a645-727013065876-kube-api-access-bz56l\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8ftcbct\" (UID: \"f6b98c30-8554-4610-a645-727013065876\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8ftcbct" Feb 23 00:20:54 crc kubenswrapper[4735]: I0223 00:20:54.786072 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8ftcbct" Feb 23 00:20:55 crc kubenswrapper[4735]: I0223 00:20:55.000050 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8ftcbct"] Feb 23 00:20:55 crc kubenswrapper[4735]: I0223 00:20:55.175826 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8ftcbct" event={"ID":"f6b98c30-8554-4610-a645-727013065876","Type":"ContainerStarted","Data":"45c3ea9b518985c6759cac8b2d8b51112faf73384a6b1e76256ea59a9455b99b"} Feb 23 00:20:55 crc kubenswrapper[4735]: I0223 00:20:55.176232 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8ftcbct" event={"ID":"f6b98c30-8554-4610-a645-727013065876","Type":"ContainerStarted","Data":"48d00910e11d2c96b5929b72c5ffe41f23d7f21fc1d5b59bf7c84b2db5f0495f"} Feb 23 00:20:55 crc kubenswrapper[4735]: I0223 00:20:55.178476 4735 generic.go:334] "Generic (PLEG): container finished" podID="7e5eb56b-e719-4b9c-8e3b-47910c2d7dab" containerID="e9660fce1a0d7548de40d37bfe539eaff264bf6108b38efc673a8953852f0080" exitCode=0 Feb 23 00:20:55 crc kubenswrapper[4735]: I0223 00:20:55.178531 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/59d91eeadfbc177692af3c8c1571c9d473bd01e833d0373cf802b3d572nmwh6" event={"ID":"7e5eb56b-e719-4b9c-8e3b-47910c2d7dab","Type":"ContainerDied","Data":"e9660fce1a0d7548de40d37bfe539eaff264bf6108b38efc673a8953852f0080"} Feb 23 00:20:55 crc kubenswrapper[4735]: I0223 00:20:55.178560 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/59d91eeadfbc177692af3c8c1571c9d473bd01e833d0373cf802b3d572nmwh6" 
event={"ID":"7e5eb56b-e719-4b9c-8e3b-47910c2d7dab","Type":"ContainerStarted","Data":"82b907d3f38ae2cc8f5a2b5adfe348b93061b06255737a59e822ea5a9781c43c"} Feb 23 00:20:56 crc kubenswrapper[4735]: I0223 00:20:56.192265 4735 generic.go:334] "Generic (PLEG): container finished" podID="f6b98c30-8554-4610-a645-727013065876" containerID="45c3ea9b518985c6759cac8b2d8b51112faf73384a6b1e76256ea59a9455b99b" exitCode=0 Feb 23 00:20:56 crc kubenswrapper[4735]: I0223 00:20:56.192341 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8ftcbct" event={"ID":"f6b98c30-8554-4610-a645-727013065876","Type":"ContainerDied","Data":"45c3ea9b518985c6759cac8b2d8b51112faf73384a6b1e76256ea59a9455b99b"} Feb 23 00:20:57 crc kubenswrapper[4735]: I0223 00:20:57.229448 4735 generic.go:334] "Generic (PLEG): container finished" podID="7e5eb56b-e719-4b9c-8e3b-47910c2d7dab" containerID="58f3aa2a3792315ba68c96d7177bedbafbae5c9847a21b89d0624dbc5ec6adec" exitCode=0 Feb 23 00:20:57 crc kubenswrapper[4735]: I0223 00:20:57.229623 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/59d91eeadfbc177692af3c8c1571c9d473bd01e833d0373cf802b3d572nmwh6" event={"ID":"7e5eb56b-e719-4b9c-8e3b-47910c2d7dab","Type":"ContainerDied","Data":"58f3aa2a3792315ba68c96d7177bedbafbae5c9847a21b89d0624dbc5ec6adec"} Feb 23 00:20:57 crc kubenswrapper[4735]: I0223 00:20:57.962194 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-g7rfg"] Feb 23 00:20:57 crc kubenswrapper[4735]: I0223 00:20:57.964158 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g7rfg" Feb 23 00:20:57 crc kubenswrapper[4735]: I0223 00:20:57.992127 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g7rfg"] Feb 23 00:20:58 crc kubenswrapper[4735]: I0223 00:20:58.125005 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5gvn\" (UniqueName: \"kubernetes.io/projected/53b30bf2-54de-46f8-a567-d5b8b49db40a-kube-api-access-c5gvn\") pod \"redhat-operators-g7rfg\" (UID: \"53b30bf2-54de-46f8-a567-d5b8b49db40a\") " pod="openshift-marketplace/redhat-operators-g7rfg" Feb 23 00:20:58 crc kubenswrapper[4735]: I0223 00:20:58.125243 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53b30bf2-54de-46f8-a567-d5b8b49db40a-utilities\") pod \"redhat-operators-g7rfg\" (UID: \"53b30bf2-54de-46f8-a567-d5b8b49db40a\") " pod="openshift-marketplace/redhat-operators-g7rfg" Feb 23 00:20:58 crc kubenswrapper[4735]: I0223 00:20:58.125428 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53b30bf2-54de-46f8-a567-d5b8b49db40a-catalog-content\") pod \"redhat-operators-g7rfg\" (UID: \"53b30bf2-54de-46f8-a567-d5b8b49db40a\") " pod="openshift-marketplace/redhat-operators-g7rfg" Feb 23 00:20:58 crc kubenswrapper[4735]: I0223 00:20:58.226746 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53b30bf2-54de-46f8-a567-d5b8b49db40a-catalog-content\") pod \"redhat-operators-g7rfg\" (UID: \"53b30bf2-54de-46f8-a567-d5b8b49db40a\") " pod="openshift-marketplace/redhat-operators-g7rfg" Feb 23 00:20:58 crc kubenswrapper[4735]: I0223 00:20:58.227257 4735 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-c5gvn\" (UniqueName: \"kubernetes.io/projected/53b30bf2-54de-46f8-a567-d5b8b49db40a-kube-api-access-c5gvn\") pod \"redhat-operators-g7rfg\" (UID: \"53b30bf2-54de-46f8-a567-d5b8b49db40a\") " pod="openshift-marketplace/redhat-operators-g7rfg" Feb 23 00:20:58 crc kubenswrapper[4735]: I0223 00:20:58.227494 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53b30bf2-54de-46f8-a567-d5b8b49db40a-utilities\") pod \"redhat-operators-g7rfg\" (UID: \"53b30bf2-54de-46f8-a567-d5b8b49db40a\") " pod="openshift-marketplace/redhat-operators-g7rfg" Feb 23 00:20:58 crc kubenswrapper[4735]: I0223 00:20:58.227681 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53b30bf2-54de-46f8-a567-d5b8b49db40a-catalog-content\") pod \"redhat-operators-g7rfg\" (UID: \"53b30bf2-54de-46f8-a567-d5b8b49db40a\") " pod="openshift-marketplace/redhat-operators-g7rfg" Feb 23 00:20:58 crc kubenswrapper[4735]: I0223 00:20:58.227978 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53b30bf2-54de-46f8-a567-d5b8b49db40a-utilities\") pod \"redhat-operators-g7rfg\" (UID: \"53b30bf2-54de-46f8-a567-d5b8b49db40a\") " pod="openshift-marketplace/redhat-operators-g7rfg" Feb 23 00:20:58 crc kubenswrapper[4735]: I0223 00:20:58.239966 4735 generic.go:334] "Generic (PLEG): container finished" podID="f6b98c30-8554-4610-a645-727013065876" containerID="a9e28d4b5c52493b9042c46125cf04bcee4c615a45e8d4bfc362d9ae3715fca1" exitCode=0 Feb 23 00:20:58 crc kubenswrapper[4735]: I0223 00:20:58.240080 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8ftcbct" 
event={"ID":"f6b98c30-8554-4610-a645-727013065876","Type":"ContainerDied","Data":"a9e28d4b5c52493b9042c46125cf04bcee4c615a45e8d4bfc362d9ae3715fca1"} Feb 23 00:20:58 crc kubenswrapper[4735]: I0223 00:20:58.244889 4735 generic.go:334] "Generic (PLEG): container finished" podID="7e5eb56b-e719-4b9c-8e3b-47910c2d7dab" containerID="8eae14c9f764df13c2382b7c1c5bec51ca974a631764019ac03d76225413bc6e" exitCode=0 Feb 23 00:20:58 crc kubenswrapper[4735]: I0223 00:20:58.244933 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/59d91eeadfbc177692af3c8c1571c9d473bd01e833d0373cf802b3d572nmwh6" event={"ID":"7e5eb56b-e719-4b9c-8e3b-47910c2d7dab","Type":"ContainerDied","Data":"8eae14c9f764df13c2382b7c1c5bec51ca974a631764019ac03d76225413bc6e"} Feb 23 00:20:58 crc kubenswrapper[4735]: I0223 00:20:58.260607 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5gvn\" (UniqueName: \"kubernetes.io/projected/53b30bf2-54de-46f8-a567-d5b8b49db40a-kube-api-access-c5gvn\") pod \"redhat-operators-g7rfg\" (UID: \"53b30bf2-54de-46f8-a567-d5b8b49db40a\") " pod="openshift-marketplace/redhat-operators-g7rfg" Feb 23 00:20:58 crc kubenswrapper[4735]: I0223 00:20:58.297456 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g7rfg" Feb 23 00:20:58 crc kubenswrapper[4735]: I0223 00:20:58.780277 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g7rfg"] Feb 23 00:20:58 crc kubenswrapper[4735]: W0223 00:20:58.788234 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53b30bf2_54de_46f8_a567_d5b8b49db40a.slice/crio-198d90cdbbe693279c1c3317c5eaa9d691d7cab7de59c08b134879aa22164e8d WatchSource:0}: Error finding container 198d90cdbbe693279c1c3317c5eaa9d691d7cab7de59c08b134879aa22164e8d: Status 404 returned error can't find the container with id 198d90cdbbe693279c1c3317c5eaa9d691d7cab7de59c08b134879aa22164e8d Feb 23 00:20:59 crc kubenswrapper[4735]: I0223 00:20:59.256165 4735 generic.go:334] "Generic (PLEG): container finished" podID="f6b98c30-8554-4610-a645-727013065876" containerID="fbb774a18fc7de333a60042dec2a549240caead5c9cb6c7a3235f0189949de26" exitCode=0 Feb 23 00:20:59 crc kubenswrapper[4735]: I0223 00:20:59.256213 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8ftcbct" event={"ID":"f6b98c30-8554-4610-a645-727013065876","Type":"ContainerDied","Data":"fbb774a18fc7de333a60042dec2a549240caead5c9cb6c7a3235f0189949de26"} Feb 23 00:20:59 crc kubenswrapper[4735]: I0223 00:20:59.258109 4735 generic.go:334] "Generic (PLEG): container finished" podID="53b30bf2-54de-46f8-a567-d5b8b49db40a" containerID="0edf3c3ae8c3fb27d97db756db9fc94301632e9e3dd49613d9d2e359a798756c" exitCode=0 Feb 23 00:20:59 crc kubenswrapper[4735]: I0223 00:20:59.258157 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7rfg" event={"ID":"53b30bf2-54de-46f8-a567-d5b8b49db40a","Type":"ContainerDied","Data":"0edf3c3ae8c3fb27d97db756db9fc94301632e9e3dd49613d9d2e359a798756c"} Feb 23 00:20:59 crc 
kubenswrapper[4735]: I0223 00:20:59.258190 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7rfg" event={"ID":"53b30bf2-54de-46f8-a567-d5b8b49db40a","Type":"ContainerStarted","Data":"198d90cdbbe693279c1c3317c5eaa9d691d7cab7de59c08b134879aa22164e8d"} Feb 23 00:20:59 crc kubenswrapper[4735]: I0223 00:20:59.529311 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/59d91eeadfbc177692af3c8c1571c9d473bd01e833d0373cf802b3d572nmwh6" Feb 23 00:20:59 crc kubenswrapper[4735]: I0223 00:20:59.660195 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7e5eb56b-e719-4b9c-8e3b-47910c2d7dab-util\") pod \"7e5eb56b-e719-4b9c-8e3b-47910c2d7dab\" (UID: \"7e5eb56b-e719-4b9c-8e3b-47910c2d7dab\") " Feb 23 00:20:59 crc kubenswrapper[4735]: I0223 00:20:59.660330 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e5eb56b-e719-4b9c-8e3b-47910c2d7dab-bundle\") pod \"7e5eb56b-e719-4b9c-8e3b-47910c2d7dab\" (UID: \"7e5eb56b-e719-4b9c-8e3b-47910c2d7dab\") " Feb 23 00:20:59 crc kubenswrapper[4735]: I0223 00:20:59.660410 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvks9\" (UniqueName: \"kubernetes.io/projected/7e5eb56b-e719-4b9c-8e3b-47910c2d7dab-kube-api-access-gvks9\") pod \"7e5eb56b-e719-4b9c-8e3b-47910c2d7dab\" (UID: \"7e5eb56b-e719-4b9c-8e3b-47910c2d7dab\") " Feb 23 00:20:59 crc kubenswrapper[4735]: I0223 00:20:59.661346 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e5eb56b-e719-4b9c-8e3b-47910c2d7dab-bundle" (OuterVolumeSpecName: "bundle") pod "7e5eb56b-e719-4b9c-8e3b-47910c2d7dab" (UID: "7e5eb56b-e719-4b9c-8e3b-47910c2d7dab"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 00:20:59 crc kubenswrapper[4735]: I0223 00:20:59.665743 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e5eb56b-e719-4b9c-8e3b-47910c2d7dab-kube-api-access-gvks9" (OuterVolumeSpecName: "kube-api-access-gvks9") pod "7e5eb56b-e719-4b9c-8e3b-47910c2d7dab" (UID: "7e5eb56b-e719-4b9c-8e3b-47910c2d7dab"). InnerVolumeSpecName "kube-api-access-gvks9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:20:59 crc kubenswrapper[4735]: I0223 00:20:59.687989 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e5eb56b-e719-4b9c-8e3b-47910c2d7dab-util" (OuterVolumeSpecName: "util") pod "7e5eb56b-e719-4b9c-8e3b-47910c2d7dab" (UID: "7e5eb56b-e719-4b9c-8e3b-47910c2d7dab"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 00:20:59 crc kubenswrapper[4735]: I0223 00:20:59.767335 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvks9\" (UniqueName: \"kubernetes.io/projected/7e5eb56b-e719-4b9c-8e3b-47910c2d7dab-kube-api-access-gvks9\") on node \"crc\" DevicePath \"\"" Feb 23 00:20:59 crc kubenswrapper[4735]: I0223 00:20:59.767388 4735 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7e5eb56b-e719-4b9c-8e3b-47910c2d7dab-util\") on node \"crc\" DevicePath \"\"" Feb 23 00:20:59 crc kubenswrapper[4735]: I0223 00:20:59.767400 4735 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e5eb56b-e719-4b9c-8e3b-47910c2d7dab-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 00:21:00 crc kubenswrapper[4735]: I0223 00:21:00.269823 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7rfg" 
event={"ID":"53b30bf2-54de-46f8-a567-d5b8b49db40a","Type":"ContainerStarted","Data":"d4d5d9258fdb1006c525b2f59151d95c1577e68302a300983a714d0f1aae6dde"} Feb 23 00:21:00 crc kubenswrapper[4735]: I0223 00:21:00.277964 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/59d91eeadfbc177692af3c8c1571c9d473bd01e833d0373cf802b3d572nmwh6" Feb 23 00:21:00 crc kubenswrapper[4735]: I0223 00:21:00.303037 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/59d91eeadfbc177692af3c8c1571c9d473bd01e833d0373cf802b3d572nmwh6" event={"ID":"7e5eb56b-e719-4b9c-8e3b-47910c2d7dab","Type":"ContainerDied","Data":"82b907d3f38ae2cc8f5a2b5adfe348b93061b06255737a59e822ea5a9781c43c"} Feb 23 00:21:00 crc kubenswrapper[4735]: I0223 00:21:00.303092 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82b907d3f38ae2cc8f5a2b5adfe348b93061b06255737a59e822ea5a9781c43c" Feb 23 00:21:00 crc kubenswrapper[4735]: I0223 00:21:00.599411 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8ftcbct" Feb 23 00:21:00 crc kubenswrapper[4735]: I0223 00:21:00.798339 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f6b98c30-8554-4610-a645-727013065876-bundle\") pod \"f6b98c30-8554-4610-a645-727013065876\" (UID: \"f6b98c30-8554-4610-a645-727013065876\") " Feb 23 00:21:00 crc kubenswrapper[4735]: I0223 00:21:00.798773 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f6b98c30-8554-4610-a645-727013065876-util\") pod \"f6b98c30-8554-4610-a645-727013065876\" (UID: \"f6b98c30-8554-4610-a645-727013065876\") " Feb 23 00:21:00 crc kubenswrapper[4735]: I0223 00:21:00.798844 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bz56l\" (UniqueName: \"kubernetes.io/projected/f6b98c30-8554-4610-a645-727013065876-kube-api-access-bz56l\") pod \"f6b98c30-8554-4610-a645-727013065876\" (UID: \"f6b98c30-8554-4610-a645-727013065876\") " Feb 23 00:21:00 crc kubenswrapper[4735]: I0223 00:21:00.799262 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6b98c30-8554-4610-a645-727013065876-bundle" (OuterVolumeSpecName: "bundle") pod "f6b98c30-8554-4610-a645-727013065876" (UID: "f6b98c30-8554-4610-a645-727013065876"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 00:21:00 crc kubenswrapper[4735]: I0223 00:21:00.805144 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6b98c30-8554-4610-a645-727013065876-kube-api-access-bz56l" (OuterVolumeSpecName: "kube-api-access-bz56l") pod "f6b98c30-8554-4610-a645-727013065876" (UID: "f6b98c30-8554-4610-a645-727013065876"). InnerVolumeSpecName "kube-api-access-bz56l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:21:00 crc kubenswrapper[4735]: I0223 00:21:00.889129 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6b98c30-8554-4610-a645-727013065876-util" (OuterVolumeSpecName: "util") pod "f6b98c30-8554-4610-a645-727013065876" (UID: "f6b98c30-8554-4610-a645-727013065876"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 00:21:00 crc kubenswrapper[4735]: I0223 00:21:00.900324 4735 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f6b98c30-8554-4610-a645-727013065876-util\") on node \"crc\" DevicePath \"\"" Feb 23 00:21:00 crc kubenswrapper[4735]: I0223 00:21:00.900374 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bz56l\" (UniqueName: \"kubernetes.io/projected/f6b98c30-8554-4610-a645-727013065876-kube-api-access-bz56l\") on node \"crc\" DevicePath \"\"" Feb 23 00:21:00 crc kubenswrapper[4735]: I0223 00:21:00.900389 4735 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f6b98c30-8554-4610-a645-727013065876-bundle\") on node \"crc\" DevicePath \"\"" Feb 23 00:21:01 crc kubenswrapper[4735]: I0223 00:21:01.294414 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8ftcbct" event={"ID":"f6b98c30-8554-4610-a645-727013065876","Type":"ContainerDied","Data":"48d00910e11d2c96b5929b72c5ffe41f23d7f21fc1d5b59bf7c84b2db5f0495f"} Feb 23 00:21:01 crc kubenswrapper[4735]: I0223 00:21:01.294485 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48d00910e11d2c96b5929b72c5ffe41f23d7f21fc1d5b59bf7c84b2db5f0495f" Feb 23 00:21:01 crc kubenswrapper[4735]: I0223 00:21:01.294423 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8ftcbct" Feb 23 00:21:01 crc kubenswrapper[4735]: I0223 00:21:01.298078 4735 generic.go:334] "Generic (PLEG): container finished" podID="53b30bf2-54de-46f8-a567-d5b8b49db40a" containerID="d4d5d9258fdb1006c525b2f59151d95c1577e68302a300983a714d0f1aae6dde" exitCode=0 Feb 23 00:21:01 crc kubenswrapper[4735]: I0223 00:21:01.298149 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7rfg" event={"ID":"53b30bf2-54de-46f8-a567-d5b8b49db40a","Type":"ContainerDied","Data":"d4d5d9258fdb1006c525b2f59151d95c1577e68302a300983a714d0f1aae6dde"} Feb 23 00:21:02 crc kubenswrapper[4735]: I0223 00:21:02.310481 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7rfg" event={"ID":"53b30bf2-54de-46f8-a567-d5b8b49db40a","Type":"ContainerStarted","Data":"e21db7adad795135838fd82bd194832136b9f46ac91e721b43a5ca4e2123d9dd"} Feb 23 00:21:02 crc kubenswrapper[4735]: I0223 00:21:02.345768 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-g7rfg" podStartSLOduration=2.751615245 podStartE2EDuration="5.345739576s" podCreationTimestamp="2026-02-23 00:20:57 +0000 UTC" firstStartedPulling="2026-02-23 00:20:59.259762799 +0000 UTC m=+817.723308810" lastFinishedPulling="2026-02-23 00:21:01.85388713 +0000 UTC m=+820.317433141" observedRunningTime="2026-02-23 00:21:02.337846047 +0000 UTC m=+820.801392078" watchObservedRunningTime="2026-02-23 00:21:02.345739576 +0000 UTC m=+820.809285587" Feb 23 00:21:08 crc kubenswrapper[4735]: I0223 00:21:08.298201 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-g7rfg" Feb 23 00:21:08 crc kubenswrapper[4735]: I0223 00:21:08.298655 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-g7rfg" Feb 23 00:21:09 crc kubenswrapper[4735]: I0223 00:21:09.352503 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-g7rfg" podUID="53b30bf2-54de-46f8-a567-d5b8b49db40a" containerName="registry-server" probeResult="failure" output=< Feb 23 00:21:09 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Feb 23 00:21:09 crc kubenswrapper[4735]: > Feb 23 00:21:09 crc kubenswrapper[4735]: I0223 00:21:09.589574 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-sdf47"] Feb 23 00:21:09 crc kubenswrapper[4735]: E0223 00:21:09.589838 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e5eb56b-e719-4b9c-8e3b-47910c2d7dab" containerName="util" Feb 23 00:21:09 crc kubenswrapper[4735]: I0223 00:21:09.589904 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e5eb56b-e719-4b9c-8e3b-47910c2d7dab" containerName="util" Feb 23 00:21:09 crc kubenswrapper[4735]: E0223 00:21:09.589925 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6b98c30-8554-4610-a645-727013065876" containerName="util" Feb 23 00:21:09 crc kubenswrapper[4735]: I0223 00:21:09.589933 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6b98c30-8554-4610-a645-727013065876" containerName="util" Feb 23 00:21:09 crc kubenswrapper[4735]: E0223 00:21:09.589943 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e5eb56b-e719-4b9c-8e3b-47910c2d7dab" containerName="pull" Feb 23 00:21:09 crc kubenswrapper[4735]: I0223 00:21:09.589950 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e5eb56b-e719-4b9c-8e3b-47910c2d7dab" containerName="pull" Feb 23 00:21:09 crc kubenswrapper[4735]: E0223 00:21:09.589959 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6b98c30-8554-4610-a645-727013065876" containerName="pull" Feb 23 00:21:09 crc 
kubenswrapper[4735]: I0223 00:21:09.589965 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6b98c30-8554-4610-a645-727013065876" containerName="pull" Feb 23 00:21:09 crc kubenswrapper[4735]: E0223 00:21:09.589974 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6b98c30-8554-4610-a645-727013065876" containerName="extract" Feb 23 00:21:09 crc kubenswrapper[4735]: I0223 00:21:09.589981 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6b98c30-8554-4610-a645-727013065876" containerName="extract" Feb 23 00:21:09 crc kubenswrapper[4735]: E0223 00:21:09.589992 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e5eb56b-e719-4b9c-8e3b-47910c2d7dab" containerName="extract" Feb 23 00:21:09 crc kubenswrapper[4735]: I0223 00:21:09.589998 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e5eb56b-e719-4b9c-8e3b-47910c2d7dab" containerName="extract" Feb 23 00:21:09 crc kubenswrapper[4735]: I0223 00:21:09.590111 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6b98c30-8554-4610-a645-727013065876" containerName="extract" Feb 23 00:21:09 crc kubenswrapper[4735]: I0223 00:21:09.590133 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e5eb56b-e719-4b9c-8e3b-47910c2d7dab" containerName="extract" Feb 23 00:21:09 crc kubenswrapper[4735]: I0223 00:21:09.590570 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/interconnect-operator-5bb49f789d-sdf47" Feb 23 00:21:09 crc kubenswrapper[4735]: I0223 00:21:09.595020 4735 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"interconnect-operator-dockercfg-h6qp2" Feb 23 00:21:09 crc kubenswrapper[4735]: I0223 00:21:09.603410 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-sdf47"] Feb 23 00:21:09 crc kubenswrapper[4735]: I0223 00:21:09.730164 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnsnb\" (UniqueName: \"kubernetes.io/projected/75aaacd1-9b98-4992-9f17-c88d3f3392ec-kube-api-access-rnsnb\") pod \"interconnect-operator-5bb49f789d-sdf47\" (UID: \"75aaacd1-9b98-4992-9f17-c88d3f3392ec\") " pod="service-telemetry/interconnect-operator-5bb49f789d-sdf47" Feb 23 00:21:09 crc kubenswrapper[4735]: I0223 00:21:09.831371 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnsnb\" (UniqueName: \"kubernetes.io/projected/75aaacd1-9b98-4992-9f17-c88d3f3392ec-kube-api-access-rnsnb\") pod \"interconnect-operator-5bb49f789d-sdf47\" (UID: \"75aaacd1-9b98-4992-9f17-c88d3f3392ec\") " pod="service-telemetry/interconnect-operator-5bb49f789d-sdf47" Feb 23 00:21:09 crc kubenswrapper[4735]: I0223 00:21:09.854614 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnsnb\" (UniqueName: \"kubernetes.io/projected/75aaacd1-9b98-4992-9f17-c88d3f3392ec-kube-api-access-rnsnb\") pod \"interconnect-operator-5bb49f789d-sdf47\" (UID: \"75aaacd1-9b98-4992-9f17-c88d3f3392ec\") " pod="service-telemetry/interconnect-operator-5bb49f789d-sdf47" Feb 23 00:21:09 crc kubenswrapper[4735]: I0223 00:21:09.909151 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/interconnect-operator-5bb49f789d-sdf47" Feb 23 00:21:10 crc kubenswrapper[4735]: I0223 00:21:10.128934 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-sdf47"] Feb 23 00:21:10 crc kubenswrapper[4735]: I0223 00:21:10.376830 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-5bb49f789d-sdf47" event={"ID":"75aaacd1-9b98-4992-9f17-c88d3f3392ec","Type":"ContainerStarted","Data":"24683a903962d2a9a1ce1a7c6bc993a667a06c106080ed36ad0cf9393a903487"} Feb 23 00:21:12 crc kubenswrapper[4735]: I0223 00:21:12.785786 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-55b89ddfb9-frnfs"] Feb 23 00:21:12 crc kubenswrapper[4735]: I0223 00:21:12.787612 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-55b89ddfb9-frnfs" Feb 23 00:21:12 crc kubenswrapper[4735]: I0223 00:21:12.790068 4735 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"service-telemetry-operator-dockercfg-csjqw" Feb 23 00:21:12 crc kubenswrapper[4735]: I0223 00:21:12.797063 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-55b89ddfb9-frnfs"] Feb 23 00:21:12 crc kubenswrapper[4735]: I0223 00:21:12.881467 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/8be8f82e-4edc-4812-a4d0-2e3dda7788bf-runner\") pod \"service-telemetry-operator-55b89ddfb9-frnfs\" (UID: \"8be8f82e-4edc-4812-a4d0-2e3dda7788bf\") " pod="service-telemetry/service-telemetry-operator-55b89ddfb9-frnfs" Feb 23 00:21:12 crc kubenswrapper[4735]: I0223 00:21:12.881566 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6nwq\" 
(UniqueName: \"kubernetes.io/projected/8be8f82e-4edc-4812-a4d0-2e3dda7788bf-kube-api-access-m6nwq\") pod \"service-telemetry-operator-55b89ddfb9-frnfs\" (UID: \"8be8f82e-4edc-4812-a4d0-2e3dda7788bf\") " pod="service-telemetry/service-telemetry-operator-55b89ddfb9-frnfs" Feb 23 00:21:12 crc kubenswrapper[4735]: I0223 00:21:12.983355 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/8be8f82e-4edc-4812-a4d0-2e3dda7788bf-runner\") pod \"service-telemetry-operator-55b89ddfb9-frnfs\" (UID: \"8be8f82e-4edc-4812-a4d0-2e3dda7788bf\") " pod="service-telemetry/service-telemetry-operator-55b89ddfb9-frnfs" Feb 23 00:21:12 crc kubenswrapper[4735]: I0223 00:21:12.983485 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6nwq\" (UniqueName: \"kubernetes.io/projected/8be8f82e-4edc-4812-a4d0-2e3dda7788bf-kube-api-access-m6nwq\") pod \"service-telemetry-operator-55b89ddfb9-frnfs\" (UID: \"8be8f82e-4edc-4812-a4d0-2e3dda7788bf\") " pod="service-telemetry/service-telemetry-operator-55b89ddfb9-frnfs" Feb 23 00:21:12 crc kubenswrapper[4735]: I0223 00:21:12.984089 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/8be8f82e-4edc-4812-a4d0-2e3dda7788bf-runner\") pod \"service-telemetry-operator-55b89ddfb9-frnfs\" (UID: \"8be8f82e-4edc-4812-a4d0-2e3dda7788bf\") " pod="service-telemetry/service-telemetry-operator-55b89ddfb9-frnfs" Feb 23 00:21:13 crc kubenswrapper[4735]: I0223 00:21:13.018953 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6nwq\" (UniqueName: \"kubernetes.io/projected/8be8f82e-4edc-4812-a4d0-2e3dda7788bf-kube-api-access-m6nwq\") pod \"service-telemetry-operator-55b89ddfb9-frnfs\" (UID: \"8be8f82e-4edc-4812-a4d0-2e3dda7788bf\") " pod="service-telemetry/service-telemetry-operator-55b89ddfb9-frnfs" Feb 23 00:21:13 crc kubenswrapper[4735]: 
I0223 00:21:13.107469 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-55b89ddfb9-frnfs" Feb 23 00:21:13 crc kubenswrapper[4735]: I0223 00:21:13.522162 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-55b89ddfb9-frnfs"] Feb 23 00:21:17 crc kubenswrapper[4735]: I0223 00:21:17.434301 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-5bb49f789d-sdf47" event={"ID":"75aaacd1-9b98-4992-9f17-c88d3f3392ec","Type":"ContainerStarted","Data":"81614491837a1b4d487d8cab9b349ddf1f44846544d2791c4c36a11332e0a997"} Feb 23 00:21:17 crc kubenswrapper[4735]: I0223 00:21:17.437206 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-55b89ddfb9-frnfs" event={"ID":"8be8f82e-4edc-4812-a4d0-2e3dda7788bf","Type":"ContainerStarted","Data":"b2ab3aafbd637ac7942d370af1d6e57579b337c8422bcbedaae80ff73e0de7ed"} Feb 23 00:21:17 crc kubenswrapper[4735]: I0223 00:21:17.457254 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/interconnect-operator-5bb49f789d-sdf47" podStartSLOduration=2.018954803 podStartE2EDuration="8.457223649s" podCreationTimestamp="2026-02-23 00:21:09 +0000 UTC" firstStartedPulling="2026-02-23 00:21:10.14031481 +0000 UTC m=+828.603860781" lastFinishedPulling="2026-02-23 00:21:16.578583656 +0000 UTC m=+835.042129627" observedRunningTime="2026-02-23 00:21:17.450357145 +0000 UTC m=+835.913903146" watchObservedRunningTime="2026-02-23 00:21:17.457223649 +0000 UTC m=+835.920769660" Feb 23 00:21:18 crc kubenswrapper[4735]: I0223 00:21:18.390707 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-g7rfg" Feb 23 00:21:18 crc kubenswrapper[4735]: I0223 00:21:18.463922 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-g7rfg" Feb 23 00:21:21 crc kubenswrapper[4735]: I0223 00:21:21.957947 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g7rfg"] Feb 23 00:21:21 crc kubenswrapper[4735]: I0223 00:21:21.958293 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-g7rfg" podUID="53b30bf2-54de-46f8-a567-d5b8b49db40a" containerName="registry-server" containerID="cri-o://e21db7adad795135838fd82bd194832136b9f46ac91e721b43a5ca4e2123d9dd" gracePeriod=2 Feb 23 00:21:22 crc kubenswrapper[4735]: I0223 00:21:22.476709 4735 generic.go:334] "Generic (PLEG): container finished" podID="53b30bf2-54de-46f8-a567-d5b8b49db40a" containerID="e21db7adad795135838fd82bd194832136b9f46ac91e721b43a5ca4e2123d9dd" exitCode=0 Feb 23 00:21:22 crc kubenswrapper[4735]: I0223 00:21:22.476812 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7rfg" event={"ID":"53b30bf2-54de-46f8-a567-d5b8b49db40a","Type":"ContainerDied","Data":"e21db7adad795135838fd82bd194832136b9f46ac91e721b43a5ca4e2123d9dd"} Feb 23 00:21:22 crc kubenswrapper[4735]: I0223 00:21:22.614553 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g7rfg" Feb 23 00:21:22 crc kubenswrapper[4735]: I0223 00:21:22.711262 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53b30bf2-54de-46f8-a567-d5b8b49db40a-utilities\") pod \"53b30bf2-54de-46f8-a567-d5b8b49db40a\" (UID: \"53b30bf2-54de-46f8-a567-d5b8b49db40a\") " Feb 23 00:21:22 crc kubenswrapper[4735]: I0223 00:21:22.711321 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5gvn\" (UniqueName: \"kubernetes.io/projected/53b30bf2-54de-46f8-a567-d5b8b49db40a-kube-api-access-c5gvn\") pod \"53b30bf2-54de-46f8-a567-d5b8b49db40a\" (UID: \"53b30bf2-54de-46f8-a567-d5b8b49db40a\") " Feb 23 00:21:22 crc kubenswrapper[4735]: I0223 00:21:22.711508 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53b30bf2-54de-46f8-a567-d5b8b49db40a-catalog-content\") pod \"53b30bf2-54de-46f8-a567-d5b8b49db40a\" (UID: \"53b30bf2-54de-46f8-a567-d5b8b49db40a\") " Feb 23 00:21:22 crc kubenswrapper[4735]: I0223 00:21:22.712926 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53b30bf2-54de-46f8-a567-d5b8b49db40a-utilities" (OuterVolumeSpecName: "utilities") pod "53b30bf2-54de-46f8-a567-d5b8b49db40a" (UID: "53b30bf2-54de-46f8-a567-d5b8b49db40a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 00:21:22 crc kubenswrapper[4735]: I0223 00:21:22.724093 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53b30bf2-54de-46f8-a567-d5b8b49db40a-kube-api-access-c5gvn" (OuterVolumeSpecName: "kube-api-access-c5gvn") pod "53b30bf2-54de-46f8-a567-d5b8b49db40a" (UID: "53b30bf2-54de-46f8-a567-d5b8b49db40a"). InnerVolumeSpecName "kube-api-access-c5gvn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:21:22 crc kubenswrapper[4735]: I0223 00:21:22.817253 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53b30bf2-54de-46f8-a567-d5b8b49db40a-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 00:21:22 crc kubenswrapper[4735]: I0223 00:21:22.817308 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5gvn\" (UniqueName: \"kubernetes.io/projected/53b30bf2-54de-46f8-a567-d5b8b49db40a-kube-api-access-c5gvn\") on node \"crc\" DevicePath \"\"" Feb 23 00:21:22 crc kubenswrapper[4735]: I0223 00:21:22.860045 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53b30bf2-54de-46f8-a567-d5b8b49db40a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "53b30bf2-54de-46f8-a567-d5b8b49db40a" (UID: "53b30bf2-54de-46f8-a567-d5b8b49db40a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 00:21:22 crc kubenswrapper[4735]: I0223 00:21:22.919163 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53b30bf2-54de-46f8-a567-d5b8b49db40a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 00:21:23 crc kubenswrapper[4735]: I0223 00:21:23.489467 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-55b89ddfb9-frnfs" event={"ID":"8be8f82e-4edc-4812-a4d0-2e3dda7788bf","Type":"ContainerStarted","Data":"07ce0bbb13db4be8f67c0b945cdcc9d33b779b570e64d7d3080ebb1e034ca0c5"} Feb 23 00:21:23 crc kubenswrapper[4735]: I0223 00:21:23.494101 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g7rfg" event={"ID":"53b30bf2-54de-46f8-a567-d5b8b49db40a","Type":"ContainerDied","Data":"198d90cdbbe693279c1c3317c5eaa9d691d7cab7de59c08b134879aa22164e8d"} Feb 23 00:21:23 crc 
kubenswrapper[4735]: I0223 00:21:23.494176 4735 scope.go:117] "RemoveContainer" containerID="e21db7adad795135838fd82bd194832136b9f46ac91e721b43a5ca4e2123d9dd" Feb 23 00:21:23 crc kubenswrapper[4735]: I0223 00:21:23.494220 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g7rfg" Feb 23 00:21:23 crc kubenswrapper[4735]: I0223 00:21:23.532643 4735 scope.go:117] "RemoveContainer" containerID="d4d5d9258fdb1006c525b2f59151d95c1577e68302a300983a714d0f1aae6dde" Feb 23 00:21:23 crc kubenswrapper[4735]: I0223 00:21:23.533153 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-55b89ddfb9-frnfs" podStartSLOduration=5.574874858 podStartE2EDuration="11.533123871s" podCreationTimestamp="2026-02-23 00:21:12 +0000 UTC" firstStartedPulling="2026-02-23 00:21:16.518119206 +0000 UTC m=+834.981665197" lastFinishedPulling="2026-02-23 00:21:22.476368229 +0000 UTC m=+840.939914210" observedRunningTime="2026-02-23 00:21:23.517010365 +0000 UTC m=+841.980556406" watchObservedRunningTime="2026-02-23 00:21:23.533123871 +0000 UTC m=+841.996669882" Feb 23 00:21:23 crc kubenswrapper[4735]: I0223 00:21:23.571288 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g7rfg"] Feb 23 00:21:23 crc kubenswrapper[4735]: I0223 00:21:23.577973 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-g7rfg"] Feb 23 00:21:23 crc kubenswrapper[4735]: I0223 00:21:23.596986 4735 scope.go:117] "RemoveContainer" containerID="0edf3c3ae8c3fb27d97db756db9fc94301632e9e3dd49613d9d2e359a798756c" Feb 23 00:21:24 crc kubenswrapper[4735]: I0223 00:21:24.281008 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53b30bf2-54de-46f8-a567-d5b8b49db40a" path="/var/lib/kubelet/pods/53b30bf2-54de-46f8-a567-d5b8b49db40a/volumes" Feb 23 00:21:43 crc kubenswrapper[4735]: I0223 
00:21:43.906605 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-ss9cq"] Feb 23 00:21:43 crc kubenswrapper[4735]: E0223 00:21:43.907468 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53b30bf2-54de-46f8-a567-d5b8b49db40a" containerName="extract-content" Feb 23 00:21:43 crc kubenswrapper[4735]: I0223 00:21:43.907483 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="53b30bf2-54de-46f8-a567-d5b8b49db40a" containerName="extract-content" Feb 23 00:21:43 crc kubenswrapper[4735]: E0223 00:21:43.907493 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53b30bf2-54de-46f8-a567-d5b8b49db40a" containerName="extract-utilities" Feb 23 00:21:43 crc kubenswrapper[4735]: I0223 00:21:43.907500 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="53b30bf2-54de-46f8-a567-d5b8b49db40a" containerName="extract-utilities" Feb 23 00:21:43 crc kubenswrapper[4735]: E0223 00:21:43.907511 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53b30bf2-54de-46f8-a567-d5b8b49db40a" containerName="registry-server" Feb 23 00:21:43 crc kubenswrapper[4735]: I0223 00:21:43.907517 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="53b30bf2-54de-46f8-a567-d5b8b49db40a" containerName="registry-server" Feb 23 00:21:43 crc kubenswrapper[4735]: I0223 00:21:43.907615 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="53b30bf2-54de-46f8-a567-d5b8b49db40a" containerName="registry-server" Feb 23 00:21:43 crc kubenswrapper[4735]: I0223 00:21:43.908088 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-ss9cq" Feb 23 00:21:43 crc kubenswrapper[4735]: I0223 00:21:43.912713 4735 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-credentials" Feb 23 00:21:43 crc kubenswrapper[4735]: I0223 00:21:43.914731 4735 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-dockercfg-vtq5g" Feb 23 00:21:43 crc kubenswrapper[4735]: I0223 00:21:43.916432 4735 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-users" Feb 23 00:21:43 crc kubenswrapper[4735]: I0223 00:21:43.916502 4735 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-ca" Feb 23 00:21:43 crc kubenswrapper[4735]: I0223 00:21:43.916502 4735 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-ca" Feb 23 00:21:43 crc kubenswrapper[4735]: I0223 00:21:43.923809 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-ss9cq"] Feb 23 00:21:43 crc kubenswrapper[4735]: I0223 00:21:43.924614 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-interconnect-sasl-config" Feb 23 00:21:43 crc kubenswrapper[4735]: I0223 00:21:43.924930 4735 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-credentials" Feb 23 00:21:44 crc kubenswrapper[4735]: I0223 00:21:44.104282 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/a566807c-75ea-4b30-b9bd-dd211d0eca22-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-ss9cq\" (UID: \"a566807c-75ea-4b30-b9bd-dd211d0eca22\") " 
pod="service-telemetry/default-interconnect-68864d46cb-ss9cq" Feb 23 00:21:44 crc kubenswrapper[4735]: I0223 00:21:44.104370 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54vtx\" (UniqueName: \"kubernetes.io/projected/a566807c-75ea-4b30-b9bd-dd211d0eca22-kube-api-access-54vtx\") pod \"default-interconnect-68864d46cb-ss9cq\" (UID: \"a566807c-75ea-4b30-b9bd-dd211d0eca22\") " pod="service-telemetry/default-interconnect-68864d46cb-ss9cq" Feb 23 00:21:44 crc kubenswrapper[4735]: I0223 00:21:44.104454 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/a566807c-75ea-4b30-b9bd-dd211d0eca22-sasl-users\") pod \"default-interconnect-68864d46cb-ss9cq\" (UID: \"a566807c-75ea-4b30-b9bd-dd211d0eca22\") " pod="service-telemetry/default-interconnect-68864d46cb-ss9cq" Feb 23 00:21:44 crc kubenswrapper[4735]: I0223 00:21:44.104566 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/a566807c-75ea-4b30-b9bd-dd211d0eca22-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-ss9cq\" (UID: \"a566807c-75ea-4b30-b9bd-dd211d0eca22\") " pod="service-telemetry/default-interconnect-68864d46cb-ss9cq" Feb 23 00:21:44 crc kubenswrapper[4735]: I0223 00:21:44.104629 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/a566807c-75ea-4b30-b9bd-dd211d0eca22-sasl-config\") pod \"default-interconnect-68864d46cb-ss9cq\" (UID: \"a566807c-75ea-4b30-b9bd-dd211d0eca22\") " pod="service-telemetry/default-interconnect-68864d46cb-ss9cq" Feb 23 00:21:44 crc kubenswrapper[4735]: I0223 00:21:44.104722 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/a566807c-75ea-4b30-b9bd-dd211d0eca22-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-ss9cq\" (UID: \"a566807c-75ea-4b30-b9bd-dd211d0eca22\") " pod="service-telemetry/default-interconnect-68864d46cb-ss9cq" Feb 23 00:21:44 crc kubenswrapper[4735]: I0223 00:21:44.104924 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/a566807c-75ea-4b30-b9bd-dd211d0eca22-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-ss9cq\" (UID: \"a566807c-75ea-4b30-b9bd-dd211d0eca22\") " pod="service-telemetry/default-interconnect-68864d46cb-ss9cq" Feb 23 00:21:44 crc kubenswrapper[4735]: I0223 00:21:44.206157 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/a566807c-75ea-4b30-b9bd-dd211d0eca22-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-ss9cq\" (UID: \"a566807c-75ea-4b30-b9bd-dd211d0eca22\") " pod="service-telemetry/default-interconnect-68864d46cb-ss9cq" Feb 23 00:21:44 crc kubenswrapper[4735]: I0223 00:21:44.206211 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54vtx\" (UniqueName: \"kubernetes.io/projected/a566807c-75ea-4b30-b9bd-dd211d0eca22-kube-api-access-54vtx\") pod \"default-interconnect-68864d46cb-ss9cq\" (UID: \"a566807c-75ea-4b30-b9bd-dd211d0eca22\") " pod="service-telemetry/default-interconnect-68864d46cb-ss9cq" Feb 23 00:21:44 crc kubenswrapper[4735]: I0223 00:21:44.206244 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/a566807c-75ea-4b30-b9bd-dd211d0eca22-sasl-users\") pod \"default-interconnect-68864d46cb-ss9cq\" (UID: 
\"a566807c-75ea-4b30-b9bd-dd211d0eca22\") " pod="service-telemetry/default-interconnect-68864d46cb-ss9cq" Feb 23 00:21:44 crc kubenswrapper[4735]: I0223 00:21:44.206285 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/a566807c-75ea-4b30-b9bd-dd211d0eca22-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-ss9cq\" (UID: \"a566807c-75ea-4b30-b9bd-dd211d0eca22\") " pod="service-telemetry/default-interconnect-68864d46cb-ss9cq" Feb 23 00:21:44 crc kubenswrapper[4735]: I0223 00:21:44.206315 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/a566807c-75ea-4b30-b9bd-dd211d0eca22-sasl-config\") pod \"default-interconnect-68864d46cb-ss9cq\" (UID: \"a566807c-75ea-4b30-b9bd-dd211d0eca22\") " pod="service-telemetry/default-interconnect-68864d46cb-ss9cq" Feb 23 00:21:44 crc kubenswrapper[4735]: I0223 00:21:44.206338 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/a566807c-75ea-4b30-b9bd-dd211d0eca22-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-ss9cq\" (UID: \"a566807c-75ea-4b30-b9bd-dd211d0eca22\") " pod="service-telemetry/default-interconnect-68864d46cb-ss9cq" Feb 23 00:21:44 crc kubenswrapper[4735]: I0223 00:21:44.206388 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/a566807c-75ea-4b30-b9bd-dd211d0eca22-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-ss9cq\" (UID: \"a566807c-75ea-4b30-b9bd-dd211d0eca22\") " pod="service-telemetry/default-interconnect-68864d46cb-ss9cq" Feb 23 00:21:44 crc kubenswrapper[4735]: I0223 00:21:44.207383 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/a566807c-75ea-4b30-b9bd-dd211d0eca22-sasl-config\") pod \"default-interconnect-68864d46cb-ss9cq\" (UID: \"a566807c-75ea-4b30-b9bd-dd211d0eca22\") " pod="service-telemetry/default-interconnect-68864d46cb-ss9cq" Feb 23 00:21:44 crc kubenswrapper[4735]: I0223 00:21:44.211360 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/a566807c-75ea-4b30-b9bd-dd211d0eca22-sasl-users\") pod \"default-interconnect-68864d46cb-ss9cq\" (UID: \"a566807c-75ea-4b30-b9bd-dd211d0eca22\") " pod="service-telemetry/default-interconnect-68864d46cb-ss9cq" Feb 23 00:21:44 crc kubenswrapper[4735]: I0223 00:21:44.211907 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/a566807c-75ea-4b30-b9bd-dd211d0eca22-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-ss9cq\" (UID: \"a566807c-75ea-4b30-b9bd-dd211d0eca22\") " pod="service-telemetry/default-interconnect-68864d46cb-ss9cq" Feb 23 00:21:44 crc kubenswrapper[4735]: I0223 00:21:44.213025 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/a566807c-75ea-4b30-b9bd-dd211d0eca22-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-ss9cq\" (UID: \"a566807c-75ea-4b30-b9bd-dd211d0eca22\") " pod="service-telemetry/default-interconnect-68864d46cb-ss9cq" Feb 23 00:21:44 crc kubenswrapper[4735]: I0223 00:21:44.214506 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/a566807c-75ea-4b30-b9bd-dd211d0eca22-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-ss9cq\" (UID: 
\"a566807c-75ea-4b30-b9bd-dd211d0eca22\") " pod="service-telemetry/default-interconnect-68864d46cb-ss9cq" Feb 23 00:21:44 crc kubenswrapper[4735]: I0223 00:21:44.214727 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/a566807c-75ea-4b30-b9bd-dd211d0eca22-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-ss9cq\" (UID: \"a566807c-75ea-4b30-b9bd-dd211d0eca22\") " pod="service-telemetry/default-interconnect-68864d46cb-ss9cq" Feb 23 00:21:44 crc kubenswrapper[4735]: I0223 00:21:44.225895 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54vtx\" (UniqueName: \"kubernetes.io/projected/a566807c-75ea-4b30-b9bd-dd211d0eca22-kube-api-access-54vtx\") pod \"default-interconnect-68864d46cb-ss9cq\" (UID: \"a566807c-75ea-4b30-b9bd-dd211d0eca22\") " pod="service-telemetry/default-interconnect-68864d46cb-ss9cq" Feb 23 00:21:44 crc kubenswrapper[4735]: I0223 00:21:44.522712 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-ss9cq" Feb 23 00:21:44 crc kubenswrapper[4735]: I0223 00:21:44.989026 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-ss9cq"] Feb 23 00:21:45 crc kubenswrapper[4735]: I0223 00:21:45.682244 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-ss9cq" event={"ID":"a566807c-75ea-4b30-b9bd-dd211d0eca22","Type":"ContainerStarted","Data":"ce9a21a3010c09df1cf440ec770e8c18cc635aa3a5bfcf76f78ef15616894285"} Feb 23 00:21:50 crc kubenswrapper[4735]: I0223 00:21:50.710401 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-snqst"] Feb 23 00:21:50 crc kubenswrapper[4735]: I0223 00:21:50.712008 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-snqst" Feb 23 00:21:50 crc kubenswrapper[4735]: I0223 00:21:50.720068 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-snqst"] Feb 23 00:21:50 crc kubenswrapper[4735]: I0223 00:21:50.811137 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bb4060c-12c1-4210-9a87-ccc7f6bdc98c-catalog-content\") pod \"community-operators-snqst\" (UID: \"8bb4060c-12c1-4210-9a87-ccc7f6bdc98c\") " pod="openshift-marketplace/community-operators-snqst" Feb 23 00:21:50 crc kubenswrapper[4735]: I0223 00:21:50.811178 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bb4060c-12c1-4210-9a87-ccc7f6bdc98c-utilities\") pod \"community-operators-snqst\" (UID: \"8bb4060c-12c1-4210-9a87-ccc7f6bdc98c\") " pod="openshift-marketplace/community-operators-snqst" Feb 23 00:21:50 crc kubenswrapper[4735]: I0223 00:21:50.811218 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqhcm\" (UniqueName: \"kubernetes.io/projected/8bb4060c-12c1-4210-9a87-ccc7f6bdc98c-kube-api-access-jqhcm\") pod \"community-operators-snqst\" (UID: \"8bb4060c-12c1-4210-9a87-ccc7f6bdc98c\") " pod="openshift-marketplace/community-operators-snqst" Feb 23 00:21:50 crc kubenswrapper[4735]: I0223 00:21:50.913918 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bb4060c-12c1-4210-9a87-ccc7f6bdc98c-catalog-content\") pod \"community-operators-snqst\" (UID: \"8bb4060c-12c1-4210-9a87-ccc7f6bdc98c\") " pod="openshift-marketplace/community-operators-snqst" Feb 23 00:21:50 crc kubenswrapper[4735]: I0223 00:21:50.913954 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bb4060c-12c1-4210-9a87-ccc7f6bdc98c-utilities\") pod \"community-operators-snqst\" (UID: \"8bb4060c-12c1-4210-9a87-ccc7f6bdc98c\") " pod="openshift-marketplace/community-operators-snqst" Feb 23 00:21:50 crc kubenswrapper[4735]: I0223 00:21:50.913999 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqhcm\" (UniqueName: \"kubernetes.io/projected/8bb4060c-12c1-4210-9a87-ccc7f6bdc98c-kube-api-access-jqhcm\") pod \"community-operators-snqst\" (UID: \"8bb4060c-12c1-4210-9a87-ccc7f6bdc98c\") " pod="openshift-marketplace/community-operators-snqst" Feb 23 00:21:50 crc kubenswrapper[4735]: I0223 00:21:50.914463 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bb4060c-12c1-4210-9a87-ccc7f6bdc98c-catalog-content\") pod \"community-operators-snqst\" (UID: \"8bb4060c-12c1-4210-9a87-ccc7f6bdc98c\") " pod="openshift-marketplace/community-operators-snqst" Feb 23 00:21:50 crc kubenswrapper[4735]: I0223 00:21:50.914619 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bb4060c-12c1-4210-9a87-ccc7f6bdc98c-utilities\") pod \"community-operators-snqst\" (UID: \"8bb4060c-12c1-4210-9a87-ccc7f6bdc98c\") " pod="openshift-marketplace/community-operators-snqst" Feb 23 00:21:50 crc kubenswrapper[4735]: I0223 00:21:50.941779 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqhcm\" (UniqueName: \"kubernetes.io/projected/8bb4060c-12c1-4210-9a87-ccc7f6bdc98c-kube-api-access-jqhcm\") pod \"community-operators-snqst\" (UID: \"8bb4060c-12c1-4210-9a87-ccc7f6bdc98c\") " pod="openshift-marketplace/community-operators-snqst" Feb 23 00:21:51 crc kubenswrapper[4735]: I0223 00:21:51.028883 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-snqst" Feb 23 00:21:51 crc kubenswrapper[4735]: I0223 00:21:51.546037 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-snqst"] Feb 23 00:21:51 crc kubenswrapper[4735]: W0223 00:21:51.555763 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bb4060c_12c1_4210_9a87_ccc7f6bdc98c.slice/crio-314c8eb37f56186262fdaa6f16660c72176e540cf3d53f82b78740525a81c18e WatchSource:0}: Error finding container 314c8eb37f56186262fdaa6f16660c72176e540cf3d53f82b78740525a81c18e: Status 404 returned error can't find the container with id 314c8eb37f56186262fdaa6f16660c72176e540cf3d53f82b78740525a81c18e Feb 23 00:21:51 crc kubenswrapper[4735]: I0223 00:21:51.745736 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-ss9cq" event={"ID":"a566807c-75ea-4b30-b9bd-dd211d0eca22","Type":"ContainerStarted","Data":"6150444994c5ebba54b86903649f02f601d756db603f84578e777f37bc678ca5"} Feb 23 00:21:51 crc kubenswrapper[4735]: I0223 00:21:51.748216 4735 generic.go:334] "Generic (PLEG): container finished" podID="8bb4060c-12c1-4210-9a87-ccc7f6bdc98c" containerID="5e168803689f9bcc2ed4663d492dd418962f5b12542ead0829d6f3a8f4edcea7" exitCode=0 Feb 23 00:21:51 crc kubenswrapper[4735]: I0223 00:21:51.748266 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-snqst" event={"ID":"8bb4060c-12c1-4210-9a87-ccc7f6bdc98c","Type":"ContainerDied","Data":"5e168803689f9bcc2ed4663d492dd418962f5b12542ead0829d6f3a8f4edcea7"} Feb 23 00:21:51 crc kubenswrapper[4735]: I0223 00:21:51.748295 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-snqst" 
event={"ID":"8bb4060c-12c1-4210-9a87-ccc7f6bdc98c","Type":"ContainerStarted","Data":"314c8eb37f56186262fdaa6f16660c72176e540cf3d53f82b78740525a81c18e"} Feb 23 00:21:51 crc kubenswrapper[4735]: I0223 00:21:51.765137 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-interconnect-68864d46cb-ss9cq" podStartSLOduration=2.614401892 podStartE2EDuration="8.765085276s" podCreationTimestamp="2026-02-23 00:21:43 +0000 UTC" firstStartedPulling="2026-02-23 00:21:44.997318023 +0000 UTC m=+863.460864014" lastFinishedPulling="2026-02-23 00:21:51.148001427 +0000 UTC m=+869.611547398" observedRunningTime="2026-02-23 00:21:51.764188495 +0000 UTC m=+870.227734466" watchObservedRunningTime="2026-02-23 00:21:51.765085276 +0000 UTC m=+870.228631247" Feb 23 00:21:52 crc kubenswrapper[4735]: I0223 00:21:52.760650 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-snqst" event={"ID":"8bb4060c-12c1-4210-9a87-ccc7f6bdc98c","Type":"ContainerStarted","Data":"486f1259f91c76db361c478d37117848fe41fe93032d1e7ef4c204235a123fe9"} Feb 23 00:21:53 crc kubenswrapper[4735]: I0223 00:21:53.774942 4735 generic.go:334] "Generic (PLEG): container finished" podID="8bb4060c-12c1-4210-9a87-ccc7f6bdc98c" containerID="486f1259f91c76db361c478d37117848fe41fe93032d1e7ef4c204235a123fe9" exitCode=0 Feb 23 00:21:53 crc kubenswrapper[4735]: I0223 00:21:53.775018 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-snqst" event={"ID":"8bb4060c-12c1-4210-9a87-ccc7f6bdc98c","Type":"ContainerDied","Data":"486f1259f91c76db361c478d37117848fe41fe93032d1e7ef4c204235a123fe9"} Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.153300 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-default-0"] Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.159335 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-default-0" Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.170485 4735 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-session-secret" Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.170977 4735 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-prometheus-proxy-tls" Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.171043 4735 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default-web-config" Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.171068 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"serving-certs-ca-bundle" Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.173146 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-1" Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.173167 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-2" Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.173492 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-0" Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.173589 4735 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default" Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.179317 4735 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default-tls-assets-0" Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.179317 4735 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-stf-dockercfg-xdrmh" Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.187080 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["service-telemetry/prometheus-default-0"] Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.272827 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/f1292e6a-8c09-4732-8766-c0ebcefbde0a-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"f1292e6a-8c09-4732-8766-c0ebcefbde0a\") " pod="service-telemetry/prometheus-default-0" Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.273122 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1292e6a-8c09-4732-8766-c0ebcefbde0a-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"f1292e6a-8c09-4732-8766-c0ebcefbde0a\") " pod="service-telemetry/prometheus-default-0" Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.273291 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/f1292e6a-8c09-4732-8766-c0ebcefbde0a-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"f1292e6a-8c09-4732-8766-c0ebcefbde0a\") " pod="service-telemetry/prometheus-default-0" Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.273450 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f1292e6a-8c09-4732-8766-c0ebcefbde0a-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"f1292e6a-8c09-4732-8766-c0ebcefbde0a\") " pod="service-telemetry/prometheus-default-0" Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.273621 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/f1292e6a-8c09-4732-8766-c0ebcefbde0a-web-config\") pod \"prometheus-default-0\" (UID: \"f1292e6a-8c09-4732-8766-c0ebcefbde0a\") " pod="service-telemetry/prometheus-default-0" Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.273779 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f1292e6a-8c09-4732-8766-c0ebcefbde0a-config\") pod \"prometheus-default-0\" (UID: \"f1292e6a-8c09-4732-8766-c0ebcefbde0a\") " pod="service-telemetry/prometheus-default-0" Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.273921 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f1292e6a-8c09-4732-8766-c0ebcefbde0a-tls-assets\") pod \"prometheus-default-0\" (UID: \"f1292e6a-8c09-4732-8766-c0ebcefbde0a\") " pod="service-telemetry/prometheus-default-0" Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.274083 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f0072827-c79a-4cd1-880c-0790aa49c2b6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f0072827-c79a-4cd1-880c-0790aa49c2b6\") pod \"prometheus-default-0\" (UID: \"f1292e6a-8c09-4732-8766-c0ebcefbde0a\") " pod="service-telemetry/prometheus-default-0" Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.274241 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f1292e6a-8c09-4732-8766-c0ebcefbde0a-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"f1292e6a-8c09-4732-8766-c0ebcefbde0a\") " pod="service-telemetry/prometheus-default-0" Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.274376 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f1292e6a-8c09-4732-8766-c0ebcefbde0a-config-out\") pod \"prometheus-default-0\" (UID: \"f1292e6a-8c09-4732-8766-c0ebcefbde0a\") " pod="service-telemetry/prometheus-default-0" Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.274573 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5xz8\" (UniqueName: \"kubernetes.io/projected/f1292e6a-8c09-4732-8766-c0ebcefbde0a-kube-api-access-j5xz8\") pod \"prometheus-default-0\" (UID: \"f1292e6a-8c09-4732-8766-c0ebcefbde0a\") " pod="service-telemetry/prometheus-default-0" Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.274737 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/f1292e6a-8c09-4732-8766-c0ebcefbde0a-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"f1292e6a-8c09-4732-8766-c0ebcefbde0a\") " pod="service-telemetry/prometheus-default-0" Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.376471 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1292e6a-8c09-4732-8766-c0ebcefbde0a-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"f1292e6a-8c09-4732-8766-c0ebcefbde0a\") " pod="service-telemetry/prometheus-default-0" Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.376523 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/f1292e6a-8c09-4732-8766-c0ebcefbde0a-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"f1292e6a-8c09-4732-8766-c0ebcefbde0a\") " pod="service-telemetry/prometheus-default-0" Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.376573 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f1292e6a-8c09-4732-8766-c0ebcefbde0a-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"f1292e6a-8c09-4732-8766-c0ebcefbde0a\") " pod="service-telemetry/prometheus-default-0" Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.376617 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f1292e6a-8c09-4732-8766-c0ebcefbde0a-web-config\") pod \"prometheus-default-0\" (UID: \"f1292e6a-8c09-4732-8766-c0ebcefbde0a\") " pod="service-telemetry/prometheus-default-0" Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.376644 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f1292e6a-8c09-4732-8766-c0ebcefbde0a-config\") pod \"prometheus-default-0\" (UID: \"f1292e6a-8c09-4732-8766-c0ebcefbde0a\") " pod="service-telemetry/prometheus-default-0" Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.376667 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f1292e6a-8c09-4732-8766-c0ebcefbde0a-tls-assets\") pod \"prometheus-default-0\" (UID: \"f1292e6a-8c09-4732-8766-c0ebcefbde0a\") " pod="service-telemetry/prometheus-default-0" Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.376738 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f0072827-c79a-4cd1-880c-0790aa49c2b6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f0072827-c79a-4cd1-880c-0790aa49c2b6\") pod \"prometheus-default-0\" (UID: \"f1292e6a-8c09-4732-8766-c0ebcefbde0a\") " pod="service-telemetry/prometheus-default-0" Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.376763 4735 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f1292e6a-8c09-4732-8766-c0ebcefbde0a-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"f1292e6a-8c09-4732-8766-c0ebcefbde0a\") " pod="service-telemetry/prometheus-default-0" Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.376793 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f1292e6a-8c09-4732-8766-c0ebcefbde0a-config-out\") pod \"prometheus-default-0\" (UID: \"f1292e6a-8c09-4732-8766-c0ebcefbde0a\") " pod="service-telemetry/prometheus-default-0" Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.376819 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5xz8\" (UniqueName: \"kubernetes.io/projected/f1292e6a-8c09-4732-8766-c0ebcefbde0a-kube-api-access-j5xz8\") pod \"prometheus-default-0\" (UID: \"f1292e6a-8c09-4732-8766-c0ebcefbde0a\") " pod="service-telemetry/prometheus-default-0" Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.376842 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/f1292e6a-8c09-4732-8766-c0ebcefbde0a-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"f1292e6a-8c09-4732-8766-c0ebcefbde0a\") " pod="service-telemetry/prometheus-default-0" Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.376890 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/f1292e6a-8c09-4732-8766-c0ebcefbde0a-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"f1292e6a-8c09-4732-8766-c0ebcefbde0a\") " pod="service-telemetry/prometheus-default-0" Feb 23 00:21:54 crc kubenswrapper[4735]: E0223 00:21:54.377133 4735 secret.go:188] Couldn't get secret 
service-telemetry/default-prometheus-proxy-tls: secret "default-prometheus-proxy-tls" not found Feb 23 00:21:54 crc kubenswrapper[4735]: E0223 00:21:54.377202 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1292e6a-8c09-4732-8766-c0ebcefbde0a-secret-default-prometheus-proxy-tls podName:f1292e6a-8c09-4732-8766-c0ebcefbde0a nodeName:}" failed. No retries permitted until 2026-02-23 00:21:54.877169666 +0000 UTC m=+873.340715647 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-default-prometheus-proxy-tls" (UniqueName: "kubernetes.io/secret/f1292e6a-8c09-4732-8766-c0ebcefbde0a-secret-default-prometheus-proxy-tls") pod "prometheus-default-0" (UID: "f1292e6a-8c09-4732-8766-c0ebcefbde0a") : secret "default-prometheus-proxy-tls" not found Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.377646 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1292e6a-8c09-4732-8766-c0ebcefbde0a-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"f1292e6a-8c09-4732-8766-c0ebcefbde0a\") " pod="service-telemetry/prometheus-default-0" Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.377899 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/f1292e6a-8c09-4732-8766-c0ebcefbde0a-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"f1292e6a-8c09-4732-8766-c0ebcefbde0a\") " pod="service-telemetry/prometheus-default-0" Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.378010 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f1292e6a-8c09-4732-8766-c0ebcefbde0a-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"f1292e6a-8c09-4732-8766-c0ebcefbde0a\") " 
pod="service-telemetry/prometheus-default-0" Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.381289 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/f1292e6a-8c09-4732-8766-c0ebcefbde0a-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"f1292e6a-8c09-4732-8766-c0ebcefbde0a\") " pod="service-telemetry/prometheus-default-0" Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.390401 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f1292e6a-8c09-4732-8766-c0ebcefbde0a-config-out\") pod \"prometheus-default-0\" (UID: \"f1292e6a-8c09-4732-8766-c0ebcefbde0a\") " pod="service-telemetry/prometheus-default-0" Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.391082 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f1292e6a-8c09-4732-8766-c0ebcefbde0a-tls-assets\") pod \"prometheus-default-0\" (UID: \"f1292e6a-8c09-4732-8766-c0ebcefbde0a\") " pod="service-telemetry/prometheus-default-0" Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.391168 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f1292e6a-8c09-4732-8766-c0ebcefbde0a-config\") pod \"prometheus-default-0\" (UID: \"f1292e6a-8c09-4732-8766-c0ebcefbde0a\") " pod="service-telemetry/prometheus-default-0" Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.391919 4735 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.391971 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f0072827-c79a-4cd1-880c-0790aa49c2b6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f0072827-c79a-4cd1-880c-0790aa49c2b6\") pod \"prometheus-default-0\" (UID: \"f1292e6a-8c09-4732-8766-c0ebcefbde0a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/31ece88c85bf1eca0e5c28d9f8b8296ef7fdbea0f015cf579278350ca99575a6/globalmount\"" pod="service-telemetry/prometheus-default-0" Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.392074 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f1292e6a-8c09-4732-8766-c0ebcefbde0a-web-config\") pod \"prometheus-default-0\" (UID: \"f1292e6a-8c09-4732-8766-c0ebcefbde0a\") " pod="service-telemetry/prometheus-default-0" Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.401006 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/f1292e6a-8c09-4732-8766-c0ebcefbde0a-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"f1292e6a-8c09-4732-8766-c0ebcefbde0a\") " pod="service-telemetry/prometheus-default-0" Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.411468 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5xz8\" (UniqueName: \"kubernetes.io/projected/f1292e6a-8c09-4732-8766-c0ebcefbde0a-kube-api-access-j5xz8\") pod \"prometheus-default-0\" (UID: \"f1292e6a-8c09-4732-8766-c0ebcefbde0a\") " pod="service-telemetry/prometheus-default-0" Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.433737 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f0072827-c79a-4cd1-880c-0790aa49c2b6\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f0072827-c79a-4cd1-880c-0790aa49c2b6\") pod \"prometheus-default-0\" (UID: \"f1292e6a-8c09-4732-8766-c0ebcefbde0a\") " pod="service-telemetry/prometheus-default-0" Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.788543 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-snqst" event={"ID":"8bb4060c-12c1-4210-9a87-ccc7f6bdc98c","Type":"ContainerStarted","Data":"a353b4bfe011831e8ac88b4fc94a405a52ac48d688442820ea266891e8c0429d"} Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.813569 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-snqst" podStartSLOduration=2.42402527 podStartE2EDuration="4.813543987s" podCreationTimestamp="2026-02-23 00:21:50 +0000 UTC" firstStartedPulling="2026-02-23 00:21:51.751465117 +0000 UTC m=+870.215011118" lastFinishedPulling="2026-02-23 00:21:54.140983854 +0000 UTC m=+872.604529835" observedRunningTime="2026-02-23 00:21:54.80828641 +0000 UTC m=+873.271832391" watchObservedRunningTime="2026-02-23 00:21:54.813543987 +0000 UTC m=+873.277089988" Feb 23 00:21:54 crc kubenswrapper[4735]: I0223 00:21:54.883069 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f1292e6a-8c09-4732-8766-c0ebcefbde0a-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"f1292e6a-8c09-4732-8766-c0ebcefbde0a\") " pod="service-telemetry/prometheus-default-0" Feb 23 00:21:54 crc kubenswrapper[4735]: E0223 00:21:54.883989 4735 secret.go:188] Couldn't get secret service-telemetry/default-prometheus-proxy-tls: secret "default-prometheus-proxy-tls" not found Feb 23 00:21:54 crc kubenswrapper[4735]: E0223 00:21:54.884050 4735 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/f1292e6a-8c09-4732-8766-c0ebcefbde0a-secret-default-prometheus-proxy-tls podName:f1292e6a-8c09-4732-8766-c0ebcefbde0a nodeName:}" failed. No retries permitted until 2026-02-23 00:21:55.884032675 +0000 UTC m=+874.347578656 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-default-prometheus-proxy-tls" (UniqueName: "kubernetes.io/secret/f1292e6a-8c09-4732-8766-c0ebcefbde0a-secret-default-prometheus-proxy-tls") pod "prometheus-default-0" (UID: "f1292e6a-8c09-4732-8766-c0ebcefbde0a") : secret "default-prometheus-proxy-tls" not found Feb 23 00:21:55 crc kubenswrapper[4735]: I0223 00:21:55.896053 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f1292e6a-8c09-4732-8766-c0ebcefbde0a-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"f1292e6a-8c09-4732-8766-c0ebcefbde0a\") " pod="service-telemetry/prometheus-default-0" Feb 23 00:21:55 crc kubenswrapper[4735]: I0223 00:21:55.901311 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/f1292e6a-8c09-4732-8766-c0ebcefbde0a-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"f1292e6a-8c09-4732-8766-c0ebcefbde0a\") " pod="service-telemetry/prometheus-default-0" Feb 23 00:21:55 crc kubenswrapper[4735]: I0223 00:21:55.989034 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-default-0" Feb 23 00:21:56 crc kubenswrapper[4735]: W0223 00:21:56.444382 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1292e6a_8c09_4732_8766_c0ebcefbde0a.slice/crio-6c9b2a747dd741038b5a4b1aeb8f1682f1e089d70ad820bbf26cdab030607b16 WatchSource:0}: Error finding container 6c9b2a747dd741038b5a4b1aeb8f1682f1e089d70ad820bbf26cdab030607b16: Status 404 returned error can't find the container with id 6c9b2a747dd741038b5a4b1aeb8f1682f1e089d70ad820bbf26cdab030607b16 Feb 23 00:21:56 crc kubenswrapper[4735]: I0223 00:21:56.446717 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-default-0"] Feb 23 00:21:56 crc kubenswrapper[4735]: I0223 00:21:56.809112 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"f1292e6a-8c09-4732-8766-c0ebcefbde0a","Type":"ContainerStarted","Data":"6c9b2a747dd741038b5a4b1aeb8f1682f1e089d70ad820bbf26cdab030607b16"} Feb 23 00:22:00 crc kubenswrapper[4735]: I0223 00:22:00.625260 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qxcn4"] Feb 23 00:22:00 crc kubenswrapper[4735]: I0223 00:22:00.628153 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qxcn4" Feb 23 00:22:00 crc kubenswrapper[4735]: I0223 00:22:00.641072 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qxcn4"] Feb 23 00:22:00 crc kubenswrapper[4735]: I0223 00:22:00.660168 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59bda72c-8385-4452-9328-a56b1dab7114-utilities\") pod \"certified-operators-qxcn4\" (UID: \"59bda72c-8385-4452-9328-a56b1dab7114\") " pod="openshift-marketplace/certified-operators-qxcn4" Feb 23 00:22:00 crc kubenswrapper[4735]: I0223 00:22:00.660260 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59bda72c-8385-4452-9328-a56b1dab7114-catalog-content\") pod \"certified-operators-qxcn4\" (UID: \"59bda72c-8385-4452-9328-a56b1dab7114\") " pod="openshift-marketplace/certified-operators-qxcn4" Feb 23 00:22:00 crc kubenswrapper[4735]: I0223 00:22:00.660312 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztpff\" (UniqueName: \"kubernetes.io/projected/59bda72c-8385-4452-9328-a56b1dab7114-kube-api-access-ztpff\") pod \"certified-operators-qxcn4\" (UID: \"59bda72c-8385-4452-9328-a56b1dab7114\") " pod="openshift-marketplace/certified-operators-qxcn4" Feb 23 00:22:00 crc kubenswrapper[4735]: I0223 00:22:00.761303 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59bda72c-8385-4452-9328-a56b1dab7114-utilities\") pod \"certified-operators-qxcn4\" (UID: \"59bda72c-8385-4452-9328-a56b1dab7114\") " pod="openshift-marketplace/certified-operators-qxcn4" Feb 23 00:22:00 crc kubenswrapper[4735]: I0223 00:22:00.761407 4735 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59bda72c-8385-4452-9328-a56b1dab7114-catalog-content\") pod \"certified-operators-qxcn4\" (UID: \"59bda72c-8385-4452-9328-a56b1dab7114\") " pod="openshift-marketplace/certified-operators-qxcn4" Feb 23 00:22:00 crc kubenswrapper[4735]: I0223 00:22:00.761438 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztpff\" (UniqueName: \"kubernetes.io/projected/59bda72c-8385-4452-9328-a56b1dab7114-kube-api-access-ztpff\") pod \"certified-operators-qxcn4\" (UID: \"59bda72c-8385-4452-9328-a56b1dab7114\") " pod="openshift-marketplace/certified-operators-qxcn4" Feb 23 00:22:00 crc kubenswrapper[4735]: I0223 00:22:00.761769 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59bda72c-8385-4452-9328-a56b1dab7114-utilities\") pod \"certified-operators-qxcn4\" (UID: \"59bda72c-8385-4452-9328-a56b1dab7114\") " pod="openshift-marketplace/certified-operators-qxcn4" Feb 23 00:22:00 crc kubenswrapper[4735]: I0223 00:22:00.762026 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59bda72c-8385-4452-9328-a56b1dab7114-catalog-content\") pod \"certified-operators-qxcn4\" (UID: \"59bda72c-8385-4452-9328-a56b1dab7114\") " pod="openshift-marketplace/certified-operators-qxcn4" Feb 23 00:22:00 crc kubenswrapper[4735]: I0223 00:22:00.794042 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztpff\" (UniqueName: \"kubernetes.io/projected/59bda72c-8385-4452-9328-a56b1dab7114-kube-api-access-ztpff\") pod \"certified-operators-qxcn4\" (UID: \"59bda72c-8385-4452-9328-a56b1dab7114\") " pod="openshift-marketplace/certified-operators-qxcn4" Feb 23 00:22:00 crc kubenswrapper[4735]: I0223 00:22:00.987756 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qxcn4" Feb 23 00:22:01 crc kubenswrapper[4735]: I0223 00:22:01.030015 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-snqst" Feb 23 00:22:01 crc kubenswrapper[4735]: I0223 00:22:01.030084 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-snqst" Feb 23 00:22:01 crc kubenswrapper[4735]: I0223 00:22:01.091172 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-snqst" Feb 23 00:22:01 crc kubenswrapper[4735]: I0223 00:22:01.493183 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qxcn4"] Feb 23 00:22:01 crc kubenswrapper[4735]: I0223 00:22:01.842820 4735 generic.go:334] "Generic (PLEG): container finished" podID="59bda72c-8385-4452-9328-a56b1dab7114" containerID="2321d7b190996a1df010f0e73049d3b87f9da30344f1f3f6c90b9fcf810bc9a0" exitCode=0 Feb 23 00:22:01 crc kubenswrapper[4735]: I0223 00:22:01.842893 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qxcn4" event={"ID":"59bda72c-8385-4452-9328-a56b1dab7114","Type":"ContainerDied","Data":"2321d7b190996a1df010f0e73049d3b87f9da30344f1f3f6c90b9fcf810bc9a0"} Feb 23 00:22:01 crc kubenswrapper[4735]: I0223 00:22:01.842936 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qxcn4" event={"ID":"59bda72c-8385-4452-9328-a56b1dab7114","Type":"ContainerStarted","Data":"dc080369d9c3323de22f954883814aa2e794e61bec0b6285ba63f8a3153121cd"} Feb 23 00:22:01 crc kubenswrapper[4735]: I0223 00:22:01.845758 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" 
event={"ID":"f1292e6a-8c09-4732-8766-c0ebcefbde0a","Type":"ContainerStarted","Data":"0530a7f8d93f8fccbbae5be8c5b0de0659859d7336ed95b7410153cab798968d"} Feb 23 00:22:01 crc kubenswrapper[4735]: I0223 00:22:01.902487 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-snqst" Feb 23 00:22:02 crc kubenswrapper[4735]: I0223 00:22:02.857313 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qxcn4" event={"ID":"59bda72c-8385-4452-9328-a56b1dab7114","Type":"ContainerStarted","Data":"77c8085ce9267ce497e861a970738a5f38ffc7d7372048d4c8ded814a4907b7f"} Feb 23 00:22:03 crc kubenswrapper[4735]: I0223 00:22:03.384440 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-snqst"] Feb 23 00:22:03 crc kubenswrapper[4735]: I0223 00:22:03.863539 4735 generic.go:334] "Generic (PLEG): container finished" podID="59bda72c-8385-4452-9328-a56b1dab7114" containerID="77c8085ce9267ce497e861a970738a5f38ffc7d7372048d4c8ded814a4907b7f" exitCode=0 Feb 23 00:22:03 crc kubenswrapper[4735]: I0223 00:22:03.863722 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-snqst" podUID="8bb4060c-12c1-4210-9a87-ccc7f6bdc98c" containerName="registry-server" containerID="cri-o://a353b4bfe011831e8ac88b4fc94a405a52ac48d688442820ea266891e8c0429d" gracePeriod=2 Feb 23 00:22:03 crc kubenswrapper[4735]: I0223 00:22:03.864387 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qxcn4" event={"ID":"59bda72c-8385-4452-9328-a56b1dab7114","Type":"ContainerDied","Data":"77c8085ce9267ce497e861a970738a5f38ffc7d7372048d4c8ded814a4907b7f"} Feb 23 00:22:04 crc kubenswrapper[4735]: I0223 00:22:04.011468 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-snmp-webhook-78bcbbdcff-52mm9"] Feb 23 00:22:04 crc 
kubenswrapper[4735]: I0223 00:22:04.012531 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-snmp-webhook-78bcbbdcff-52mm9" Feb 23 00:22:04 crc kubenswrapper[4735]: I0223 00:22:04.022421 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-snmp-webhook-78bcbbdcff-52mm9"] Feb 23 00:22:04 crc kubenswrapper[4735]: I0223 00:22:04.106581 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lg65\" (UniqueName: \"kubernetes.io/projected/213ffefc-86ab-4944-a9b5-888a99455d05-kube-api-access-9lg65\") pod \"default-snmp-webhook-78bcbbdcff-52mm9\" (UID: \"213ffefc-86ab-4944-a9b5-888a99455d05\") " pod="service-telemetry/default-snmp-webhook-78bcbbdcff-52mm9" Feb 23 00:22:04 crc kubenswrapper[4735]: I0223 00:22:04.207737 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lg65\" (UniqueName: \"kubernetes.io/projected/213ffefc-86ab-4944-a9b5-888a99455d05-kube-api-access-9lg65\") pod \"default-snmp-webhook-78bcbbdcff-52mm9\" (UID: \"213ffefc-86ab-4944-a9b5-888a99455d05\") " pod="service-telemetry/default-snmp-webhook-78bcbbdcff-52mm9" Feb 23 00:22:04 crc kubenswrapper[4735]: I0223 00:22:04.232466 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lg65\" (UniqueName: \"kubernetes.io/projected/213ffefc-86ab-4944-a9b5-888a99455d05-kube-api-access-9lg65\") pod \"default-snmp-webhook-78bcbbdcff-52mm9\" (UID: \"213ffefc-86ab-4944-a9b5-888a99455d05\") " pod="service-telemetry/default-snmp-webhook-78bcbbdcff-52mm9" Feb 23 00:22:04 crc kubenswrapper[4735]: I0223 00:22:04.301179 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-snqst" Feb 23 00:22:04 crc kubenswrapper[4735]: I0223 00:22:04.338118 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-snmp-webhook-78bcbbdcff-52mm9" Feb 23 00:22:04 crc kubenswrapper[4735]: I0223 00:22:04.410882 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bb4060c-12c1-4210-9a87-ccc7f6bdc98c-utilities\") pod \"8bb4060c-12c1-4210-9a87-ccc7f6bdc98c\" (UID: \"8bb4060c-12c1-4210-9a87-ccc7f6bdc98c\") " Feb 23 00:22:04 crc kubenswrapper[4735]: I0223 00:22:04.411255 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bb4060c-12c1-4210-9a87-ccc7f6bdc98c-catalog-content\") pod \"8bb4060c-12c1-4210-9a87-ccc7f6bdc98c\" (UID: \"8bb4060c-12c1-4210-9a87-ccc7f6bdc98c\") " Feb 23 00:22:04 crc kubenswrapper[4735]: I0223 00:22:04.411390 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqhcm\" (UniqueName: \"kubernetes.io/projected/8bb4060c-12c1-4210-9a87-ccc7f6bdc98c-kube-api-access-jqhcm\") pod \"8bb4060c-12c1-4210-9a87-ccc7f6bdc98c\" (UID: \"8bb4060c-12c1-4210-9a87-ccc7f6bdc98c\") " Feb 23 00:22:04 crc kubenswrapper[4735]: I0223 00:22:04.412554 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bb4060c-12c1-4210-9a87-ccc7f6bdc98c-utilities" (OuterVolumeSpecName: "utilities") pod "8bb4060c-12c1-4210-9a87-ccc7f6bdc98c" (UID: "8bb4060c-12c1-4210-9a87-ccc7f6bdc98c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 00:22:04 crc kubenswrapper[4735]: I0223 00:22:04.414834 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bb4060c-12c1-4210-9a87-ccc7f6bdc98c-kube-api-access-jqhcm" (OuterVolumeSpecName: "kube-api-access-jqhcm") pod "8bb4060c-12c1-4210-9a87-ccc7f6bdc98c" (UID: "8bb4060c-12c1-4210-9a87-ccc7f6bdc98c"). InnerVolumeSpecName "kube-api-access-jqhcm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:22:04 crc kubenswrapper[4735]: I0223 00:22:04.458195 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bb4060c-12c1-4210-9a87-ccc7f6bdc98c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8bb4060c-12c1-4210-9a87-ccc7f6bdc98c" (UID: "8bb4060c-12c1-4210-9a87-ccc7f6bdc98c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 00:22:04 crc kubenswrapper[4735]: I0223 00:22:04.513602 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bb4060c-12c1-4210-9a87-ccc7f6bdc98c-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 00:22:04 crc kubenswrapper[4735]: I0223 00:22:04.513666 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bb4060c-12c1-4210-9a87-ccc7f6bdc98c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 00:22:04 crc kubenswrapper[4735]: I0223 00:22:04.513685 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqhcm\" (UniqueName: \"kubernetes.io/projected/8bb4060c-12c1-4210-9a87-ccc7f6bdc98c-kube-api-access-jqhcm\") on node \"crc\" DevicePath \"\"" Feb 23 00:22:04 crc kubenswrapper[4735]: I0223 00:22:04.772518 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-snmp-webhook-78bcbbdcff-52mm9"] Feb 23 00:22:04 crc kubenswrapper[4735]: W0223 00:22:04.780004 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod213ffefc_86ab_4944_a9b5_888a99455d05.slice/crio-31b9f772b0d1b9d8d67cb2ec5c4ea309e249484dfd8b618120c7be91ac07e01d WatchSource:0}: Error finding container 31b9f772b0d1b9d8d67cb2ec5c4ea309e249484dfd8b618120c7be91ac07e01d: Status 404 returned error can't find the container with id 
31b9f772b0d1b9d8d67cb2ec5c4ea309e249484dfd8b618120c7be91ac07e01d Feb 23 00:22:04 crc kubenswrapper[4735]: I0223 00:22:04.872600 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-snmp-webhook-78bcbbdcff-52mm9" event={"ID":"213ffefc-86ab-4944-a9b5-888a99455d05","Type":"ContainerStarted","Data":"31b9f772b0d1b9d8d67cb2ec5c4ea309e249484dfd8b618120c7be91ac07e01d"} Feb 23 00:22:04 crc kubenswrapper[4735]: I0223 00:22:04.874724 4735 generic.go:334] "Generic (PLEG): container finished" podID="8bb4060c-12c1-4210-9a87-ccc7f6bdc98c" containerID="a353b4bfe011831e8ac88b4fc94a405a52ac48d688442820ea266891e8c0429d" exitCode=0 Feb 23 00:22:04 crc kubenswrapper[4735]: I0223 00:22:04.874780 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-snqst" event={"ID":"8bb4060c-12c1-4210-9a87-ccc7f6bdc98c","Type":"ContainerDied","Data":"a353b4bfe011831e8ac88b4fc94a405a52ac48d688442820ea266891e8c0429d"} Feb 23 00:22:04 crc kubenswrapper[4735]: I0223 00:22:04.874804 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-snqst" event={"ID":"8bb4060c-12c1-4210-9a87-ccc7f6bdc98c","Type":"ContainerDied","Data":"314c8eb37f56186262fdaa6f16660c72176e540cf3d53f82b78740525a81c18e"} Feb 23 00:22:04 crc kubenswrapper[4735]: I0223 00:22:04.874825 4735 scope.go:117] "RemoveContainer" containerID="a353b4bfe011831e8ac88b4fc94a405a52ac48d688442820ea266891e8c0429d" Feb 23 00:22:04 crc kubenswrapper[4735]: I0223 00:22:04.874990 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-snqst" Feb 23 00:22:04 crc kubenswrapper[4735]: I0223 00:22:04.882411 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qxcn4" event={"ID":"59bda72c-8385-4452-9328-a56b1dab7114","Type":"ContainerStarted","Data":"6aee6a28e0aeeafac4e04ced615e53bd3c3b2775b386e2b811e18283fcf9f586"} Feb 23 00:22:04 crc kubenswrapper[4735]: I0223 00:22:04.910100 4735 scope.go:117] "RemoveContainer" containerID="486f1259f91c76db361c478d37117848fe41fe93032d1e7ef4c204235a123fe9" Feb 23 00:22:04 crc kubenswrapper[4735]: I0223 00:22:04.915960 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qxcn4" podStartSLOduration=2.540765716 podStartE2EDuration="4.915843403s" podCreationTimestamp="2026-02-23 00:22:00 +0000 UTC" firstStartedPulling="2026-02-23 00:22:01.845206995 +0000 UTC m=+880.308752956" lastFinishedPulling="2026-02-23 00:22:04.220284672 +0000 UTC m=+882.683830643" observedRunningTime="2026-02-23 00:22:04.910956704 +0000 UTC m=+883.374502685" watchObservedRunningTime="2026-02-23 00:22:04.915843403 +0000 UTC m=+883.379389374" Feb 23 00:22:04 crc kubenswrapper[4735]: I0223 00:22:04.938809 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-snqst"] Feb 23 00:22:04 crc kubenswrapper[4735]: I0223 00:22:04.945424 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-snqst"] Feb 23 00:22:04 crc kubenswrapper[4735]: I0223 00:22:04.964768 4735 scope.go:117] "RemoveContainer" containerID="5e168803689f9bcc2ed4663d492dd418962f5b12542ead0829d6f3a8f4edcea7" Feb 23 00:22:05 crc kubenswrapper[4735]: I0223 00:22:05.018996 4735 scope.go:117] "RemoveContainer" containerID="a353b4bfe011831e8ac88b4fc94a405a52ac48d688442820ea266891e8c0429d" Feb 23 00:22:05 crc kubenswrapper[4735]: E0223 00:22:05.019538 4735 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a353b4bfe011831e8ac88b4fc94a405a52ac48d688442820ea266891e8c0429d\": container with ID starting with a353b4bfe011831e8ac88b4fc94a405a52ac48d688442820ea266891e8c0429d not found: ID does not exist" containerID="a353b4bfe011831e8ac88b4fc94a405a52ac48d688442820ea266891e8c0429d" Feb 23 00:22:05 crc kubenswrapper[4735]: I0223 00:22:05.019587 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a353b4bfe011831e8ac88b4fc94a405a52ac48d688442820ea266891e8c0429d"} err="failed to get container status \"a353b4bfe011831e8ac88b4fc94a405a52ac48d688442820ea266891e8c0429d\": rpc error: code = NotFound desc = could not find container \"a353b4bfe011831e8ac88b4fc94a405a52ac48d688442820ea266891e8c0429d\": container with ID starting with a353b4bfe011831e8ac88b4fc94a405a52ac48d688442820ea266891e8c0429d not found: ID does not exist" Feb 23 00:22:05 crc kubenswrapper[4735]: I0223 00:22:05.019616 4735 scope.go:117] "RemoveContainer" containerID="486f1259f91c76db361c478d37117848fe41fe93032d1e7ef4c204235a123fe9" Feb 23 00:22:05 crc kubenswrapper[4735]: E0223 00:22:05.019985 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"486f1259f91c76db361c478d37117848fe41fe93032d1e7ef4c204235a123fe9\": container with ID starting with 486f1259f91c76db361c478d37117848fe41fe93032d1e7ef4c204235a123fe9 not found: ID does not exist" containerID="486f1259f91c76db361c478d37117848fe41fe93032d1e7ef4c204235a123fe9" Feb 23 00:22:05 crc kubenswrapper[4735]: I0223 00:22:05.020013 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"486f1259f91c76db361c478d37117848fe41fe93032d1e7ef4c204235a123fe9"} err="failed to get container status \"486f1259f91c76db361c478d37117848fe41fe93032d1e7ef4c204235a123fe9\": rpc error: code = NotFound desc = could 
not find container \"486f1259f91c76db361c478d37117848fe41fe93032d1e7ef4c204235a123fe9\": container with ID starting with 486f1259f91c76db361c478d37117848fe41fe93032d1e7ef4c204235a123fe9 not found: ID does not exist" Feb 23 00:22:05 crc kubenswrapper[4735]: I0223 00:22:05.020029 4735 scope.go:117] "RemoveContainer" containerID="5e168803689f9bcc2ed4663d492dd418962f5b12542ead0829d6f3a8f4edcea7" Feb 23 00:22:05 crc kubenswrapper[4735]: E0223 00:22:05.020343 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e168803689f9bcc2ed4663d492dd418962f5b12542ead0829d6f3a8f4edcea7\": container with ID starting with 5e168803689f9bcc2ed4663d492dd418962f5b12542ead0829d6f3a8f4edcea7 not found: ID does not exist" containerID="5e168803689f9bcc2ed4663d492dd418962f5b12542ead0829d6f3a8f4edcea7" Feb 23 00:22:05 crc kubenswrapper[4735]: I0223 00:22:05.020382 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e168803689f9bcc2ed4663d492dd418962f5b12542ead0829d6f3a8f4edcea7"} err="failed to get container status \"5e168803689f9bcc2ed4663d492dd418962f5b12542ead0829d6f3a8f4edcea7\": rpc error: code = NotFound desc = could not find container \"5e168803689f9bcc2ed4663d492dd418962f5b12542ead0829d6f3a8f4edcea7\": container with ID starting with 5e168803689f9bcc2ed4663d492dd418962f5b12542ead0829d6f3a8f4edcea7 not found: ID does not exist" Feb 23 00:22:06 crc kubenswrapper[4735]: I0223 00:22:06.282703 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bb4060c-12c1-4210-9a87-ccc7f6bdc98c" path="/var/lib/kubelet/pods/8bb4060c-12c1-4210-9a87-ccc7f6bdc98c/volumes" Feb 23 00:22:07 crc kubenswrapper[4735]: I0223 00:22:07.421672 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/alertmanager-default-0"] Feb 23 00:22:07 crc kubenswrapper[4735]: E0223 00:22:07.422038 4735 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8bb4060c-12c1-4210-9a87-ccc7f6bdc98c" containerName="extract-utilities" Feb 23 00:22:07 crc kubenswrapper[4735]: I0223 00:22:07.422055 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bb4060c-12c1-4210-9a87-ccc7f6bdc98c" containerName="extract-utilities" Feb 23 00:22:07 crc kubenswrapper[4735]: E0223 00:22:07.422071 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bb4060c-12c1-4210-9a87-ccc7f6bdc98c" containerName="extract-content" Feb 23 00:22:07 crc kubenswrapper[4735]: I0223 00:22:07.422080 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bb4060c-12c1-4210-9a87-ccc7f6bdc98c" containerName="extract-content" Feb 23 00:22:07 crc kubenswrapper[4735]: E0223 00:22:07.422115 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bb4060c-12c1-4210-9a87-ccc7f6bdc98c" containerName="registry-server" Feb 23 00:22:07 crc kubenswrapper[4735]: I0223 00:22:07.422126 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bb4060c-12c1-4210-9a87-ccc7f6bdc98c" containerName="registry-server" Feb 23 00:22:07 crc kubenswrapper[4735]: I0223 00:22:07.422361 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bb4060c-12c1-4210-9a87-ccc7f6bdc98c" containerName="registry-server" Feb 23 00:22:07 crc kubenswrapper[4735]: I0223 00:22:07.426544 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/alertmanager-default-0" Feb 23 00:22:07 crc kubenswrapper[4735]: I0223 00:22:07.429668 4735 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-cluster-tls-config" Feb 23 00:22:07 crc kubenswrapper[4735]: I0223 00:22:07.430093 4735 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-generated" Feb 23 00:22:07 crc kubenswrapper[4735]: I0223 00:22:07.430516 4735 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-alertmanager-proxy-tls" Feb 23 00:22:07 crc kubenswrapper[4735]: I0223 00:22:07.430552 4735 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-tls-assets-0" Feb 23 00:22:07 crc kubenswrapper[4735]: I0223 00:22:07.430635 4735 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-stf-dockercfg-h26hp" Feb 23 00:22:07 crc kubenswrapper[4735]: I0223 00:22:07.431915 4735 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-web-config" Feb 23 00:22:07 crc kubenswrapper[4735]: I0223 00:22:07.439151 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/alertmanager-default-0"] Feb 23 00:22:07 crc kubenswrapper[4735]: I0223 00:22:07.559904 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/1ad20943-871d-4331-a0e1-5145da93efee-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"1ad20943-871d-4331-a0e1-5145da93efee\") " pod="service-telemetry/alertmanager-default-0" Feb 23 00:22:07 crc kubenswrapper[4735]: I0223 00:22:07.559974 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/1ad20943-871d-4331-a0e1-5145da93efee-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"1ad20943-871d-4331-a0e1-5145da93efee\") " pod="service-telemetry/alertmanager-default-0" Feb 23 00:22:07 crc kubenswrapper[4735]: I0223 00:22:07.560221 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1ad20943-871d-4331-a0e1-5145da93efee-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"1ad20943-871d-4331-a0e1-5145da93efee\") " pod="service-telemetry/alertmanager-default-0" Feb 23 00:22:07 crc kubenswrapper[4735]: I0223 00:22:07.560270 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1ad20943-871d-4331-a0e1-5145da93efee-config-out\") pod \"alertmanager-default-0\" (UID: \"1ad20943-871d-4331-a0e1-5145da93efee\") " pod="service-telemetry/alertmanager-default-0" Feb 23 00:22:07 crc kubenswrapper[4735]: I0223 00:22:07.560365 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxjnc\" (UniqueName: \"kubernetes.io/projected/1ad20943-871d-4331-a0e1-5145da93efee-kube-api-access-fxjnc\") pod \"alertmanager-default-0\" (UID: \"1ad20943-871d-4331-a0e1-5145da93efee\") " pod="service-telemetry/alertmanager-default-0" Feb 23 00:22:07 crc kubenswrapper[4735]: I0223 00:22:07.560401 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1ad20943-871d-4331-a0e1-5145da93efee-web-config\") pod \"alertmanager-default-0\" (UID: \"1ad20943-871d-4331-a0e1-5145da93efee\") " pod="service-telemetry/alertmanager-default-0" Feb 23 00:22:07 crc kubenswrapper[4735]: I0223 00:22:07.560436 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1ad20943-871d-4331-a0e1-5145da93efee-tls-assets\") pod \"alertmanager-default-0\" (UID: \"1ad20943-871d-4331-a0e1-5145da93efee\") " pod="service-telemetry/alertmanager-default-0" Feb 23 00:22:07 crc kubenswrapper[4735]: I0223 00:22:07.560481 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ddd1f84a-ee6c-496d-83e5-7bf3dc7d86f5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ddd1f84a-ee6c-496d-83e5-7bf3dc7d86f5\") pod \"alertmanager-default-0\" (UID: \"1ad20943-871d-4331-a0e1-5145da93efee\") " pod="service-telemetry/alertmanager-default-0" Feb 23 00:22:07 crc kubenswrapper[4735]: I0223 00:22:07.560529 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1ad20943-871d-4331-a0e1-5145da93efee-config-volume\") pod \"alertmanager-default-0\" (UID: \"1ad20943-871d-4331-a0e1-5145da93efee\") " pod="service-telemetry/alertmanager-default-0" Feb 23 00:22:07 crc kubenswrapper[4735]: I0223 00:22:07.661937 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/1ad20943-871d-4331-a0e1-5145da93efee-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"1ad20943-871d-4331-a0e1-5145da93efee\") " pod="service-telemetry/alertmanager-default-0" Feb 23 00:22:07 crc kubenswrapper[4735]: I0223 00:22:07.662006 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/1ad20943-871d-4331-a0e1-5145da93efee-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"1ad20943-871d-4331-a0e1-5145da93efee\") " pod="service-telemetry/alertmanager-default-0" Feb 23 00:22:07 crc kubenswrapper[4735]: I0223 00:22:07.662082 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1ad20943-871d-4331-a0e1-5145da93efee-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"1ad20943-871d-4331-a0e1-5145da93efee\") " pod="service-telemetry/alertmanager-default-0" Feb 23 00:22:07 crc kubenswrapper[4735]: I0223 00:22:07.662110 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1ad20943-871d-4331-a0e1-5145da93efee-config-out\") pod \"alertmanager-default-0\" (UID: \"1ad20943-871d-4331-a0e1-5145da93efee\") " pod="service-telemetry/alertmanager-default-0" Feb 23 00:22:07 crc kubenswrapper[4735]: I0223 00:22:07.662144 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxjnc\" (UniqueName: \"kubernetes.io/projected/1ad20943-871d-4331-a0e1-5145da93efee-kube-api-access-fxjnc\") pod \"alertmanager-default-0\" (UID: \"1ad20943-871d-4331-a0e1-5145da93efee\") " pod="service-telemetry/alertmanager-default-0" Feb 23 00:22:07 crc kubenswrapper[4735]: E0223 00:22:07.662196 4735 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Feb 23 00:22:07 crc kubenswrapper[4735]: E0223 00:22:07.662268 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ad20943-871d-4331-a0e1-5145da93efee-secret-default-alertmanager-proxy-tls podName:1ad20943-871d-4331-a0e1-5145da93efee nodeName:}" failed. No retries permitted until 2026-02-23 00:22:08.162248896 +0000 UTC m=+886.625794867 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/1ad20943-871d-4331-a0e1-5145da93efee-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "1ad20943-871d-4331-a0e1-5145da93efee") : secret "default-alertmanager-proxy-tls" not found Feb 23 00:22:07 crc kubenswrapper[4735]: I0223 00:22:07.663105 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1ad20943-871d-4331-a0e1-5145da93efee-web-config\") pod \"alertmanager-default-0\" (UID: \"1ad20943-871d-4331-a0e1-5145da93efee\") " pod="service-telemetry/alertmanager-default-0" Feb 23 00:22:07 crc kubenswrapper[4735]: I0223 00:22:07.663159 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1ad20943-871d-4331-a0e1-5145da93efee-tls-assets\") pod \"alertmanager-default-0\" (UID: \"1ad20943-871d-4331-a0e1-5145da93efee\") " pod="service-telemetry/alertmanager-default-0" Feb 23 00:22:07 crc kubenswrapper[4735]: I0223 00:22:07.663184 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ddd1f84a-ee6c-496d-83e5-7bf3dc7d86f5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ddd1f84a-ee6c-496d-83e5-7bf3dc7d86f5\") pod \"alertmanager-default-0\" (UID: \"1ad20943-871d-4331-a0e1-5145da93efee\") " pod="service-telemetry/alertmanager-default-0" Feb 23 00:22:07 crc kubenswrapper[4735]: I0223 00:22:07.663232 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1ad20943-871d-4331-a0e1-5145da93efee-config-volume\") pod \"alertmanager-default-0\" (UID: \"1ad20943-871d-4331-a0e1-5145da93efee\") " pod="service-telemetry/alertmanager-default-0" Feb 23 00:22:07 crc kubenswrapper[4735]: I0223 00:22:07.671222 4735 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1ad20943-871d-4331-a0e1-5145da93efee-tls-assets\") pod \"alertmanager-default-0\" (UID: \"1ad20943-871d-4331-a0e1-5145da93efee\") " pod="service-telemetry/alertmanager-default-0" Feb 23 00:22:07 crc kubenswrapper[4735]: I0223 00:22:07.671319 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1ad20943-871d-4331-a0e1-5145da93efee-config-out\") pod \"alertmanager-default-0\" (UID: \"1ad20943-871d-4331-a0e1-5145da93efee\") " pod="service-telemetry/alertmanager-default-0" Feb 23 00:22:07 crc kubenswrapper[4735]: I0223 00:22:07.672015 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1ad20943-871d-4331-a0e1-5145da93efee-config-volume\") pod \"alertmanager-default-0\" (UID: \"1ad20943-871d-4331-a0e1-5145da93efee\") " pod="service-telemetry/alertmanager-default-0" Feb 23 00:22:07 crc kubenswrapper[4735]: I0223 00:22:07.672245 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1ad20943-871d-4331-a0e1-5145da93efee-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"1ad20943-871d-4331-a0e1-5145da93efee\") " pod="service-telemetry/alertmanager-default-0" Feb 23 00:22:07 crc kubenswrapper[4735]: I0223 00:22:07.675434 4735 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 23 00:22:07 crc kubenswrapper[4735]: I0223 00:22:07.675465 4735 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ddd1f84a-ee6c-496d-83e5-7bf3dc7d86f5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ddd1f84a-ee6c-496d-83e5-7bf3dc7d86f5\") pod \"alertmanager-default-0\" (UID: \"1ad20943-871d-4331-a0e1-5145da93efee\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/fd02ac4c395b15ae59216cccff0b1fbe15d72c352e42cd0de3c2edda1a47a053/globalmount\"" pod="service-telemetry/alertmanager-default-0" Feb 23 00:22:07 crc kubenswrapper[4735]: I0223 00:22:07.678467 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/1ad20943-871d-4331-a0e1-5145da93efee-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"1ad20943-871d-4331-a0e1-5145da93efee\") " pod="service-telemetry/alertmanager-default-0" Feb 23 00:22:07 crc kubenswrapper[4735]: I0223 00:22:07.695309 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1ad20943-871d-4331-a0e1-5145da93efee-web-config\") pod \"alertmanager-default-0\" (UID: \"1ad20943-871d-4331-a0e1-5145da93efee\") " pod="service-telemetry/alertmanager-default-0" Feb 23 00:22:07 crc kubenswrapper[4735]: I0223 00:22:07.695614 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxjnc\" (UniqueName: \"kubernetes.io/projected/1ad20943-871d-4331-a0e1-5145da93efee-kube-api-access-fxjnc\") pod \"alertmanager-default-0\" (UID: \"1ad20943-871d-4331-a0e1-5145da93efee\") " pod="service-telemetry/alertmanager-default-0" Feb 23 00:22:07 crc kubenswrapper[4735]: I0223 00:22:07.721748 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ddd1f84a-ee6c-496d-83e5-7bf3dc7d86f5\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ddd1f84a-ee6c-496d-83e5-7bf3dc7d86f5\") pod \"alertmanager-default-0\" (UID: \"1ad20943-871d-4331-a0e1-5145da93efee\") " pod="service-telemetry/alertmanager-default-0" Feb 23 00:22:07 crc kubenswrapper[4735]: I0223 00:22:07.913325 4735 generic.go:334] "Generic (PLEG): container finished" podID="f1292e6a-8c09-4732-8766-c0ebcefbde0a" containerID="0530a7f8d93f8fccbbae5be8c5b0de0659859d7336ed95b7410153cab798968d" exitCode=0 Feb 23 00:22:07 crc kubenswrapper[4735]: I0223 00:22:07.913372 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"f1292e6a-8c09-4732-8766-c0ebcefbde0a","Type":"ContainerDied","Data":"0530a7f8d93f8fccbbae5be8c5b0de0659859d7336ed95b7410153cab798968d"} Feb 23 00:22:08 crc kubenswrapper[4735]: I0223 00:22:08.171958 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/1ad20943-871d-4331-a0e1-5145da93efee-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"1ad20943-871d-4331-a0e1-5145da93efee\") " pod="service-telemetry/alertmanager-default-0" Feb 23 00:22:08 crc kubenswrapper[4735]: E0223 00:22:08.172167 4735 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Feb 23 00:22:08 crc kubenswrapper[4735]: E0223 00:22:08.172286 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ad20943-871d-4331-a0e1-5145da93efee-secret-default-alertmanager-proxy-tls podName:1ad20943-871d-4331-a0e1-5145da93efee nodeName:}" failed. No retries permitted until 2026-02-23 00:22:09.172257821 +0000 UTC m=+887.635803832 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/1ad20943-871d-4331-a0e1-5145da93efee-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "1ad20943-871d-4331-a0e1-5145da93efee") : secret "default-alertmanager-proxy-tls" not found Feb 23 00:22:09 crc kubenswrapper[4735]: I0223 00:22:09.192759 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/1ad20943-871d-4331-a0e1-5145da93efee-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"1ad20943-871d-4331-a0e1-5145da93efee\") " pod="service-telemetry/alertmanager-default-0" Feb 23 00:22:09 crc kubenswrapper[4735]: E0223 00:22:09.192953 4735 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found Feb 23 00:22:09 crc kubenswrapper[4735]: E0223 00:22:09.193034 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ad20943-871d-4331-a0e1-5145da93efee-secret-default-alertmanager-proxy-tls podName:1ad20943-871d-4331-a0e1-5145da93efee nodeName:}" failed. No retries permitted until 2026-02-23 00:22:11.193015781 +0000 UTC m=+889.656561742 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/1ad20943-871d-4331-a0e1-5145da93efee-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "1ad20943-871d-4331-a0e1-5145da93efee") : secret "default-alertmanager-proxy-tls" not found Feb 23 00:22:10 crc kubenswrapper[4735]: I0223 00:22:10.988158 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qxcn4" Feb 23 00:22:10 crc kubenswrapper[4735]: I0223 00:22:10.988481 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qxcn4" Feb 23 00:22:11 crc kubenswrapper[4735]: I0223 00:22:11.040810 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qxcn4" Feb 23 00:22:11 crc kubenswrapper[4735]: I0223 00:22:11.220634 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/1ad20943-871d-4331-a0e1-5145da93efee-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"1ad20943-871d-4331-a0e1-5145da93efee\") " pod="service-telemetry/alertmanager-default-0" Feb 23 00:22:11 crc kubenswrapper[4735]: I0223 00:22:11.238613 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/1ad20943-871d-4331-a0e1-5145da93efee-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"1ad20943-871d-4331-a0e1-5145da93efee\") " pod="service-telemetry/alertmanager-default-0" Feb 23 00:22:11 crc kubenswrapper[4735]: I0223 00:22:11.358434 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/alertmanager-default-0" Feb 23 00:22:12 crc kubenswrapper[4735]: I0223 00:22:12.017250 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qxcn4" Feb 23 00:22:12 crc kubenswrapper[4735]: I0223 00:22:12.082641 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qxcn4"] Feb 23 00:22:13 crc kubenswrapper[4735]: I0223 00:22:13.765567 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/alertmanager-default-0"] Feb 23 00:22:13 crc kubenswrapper[4735]: I0223 00:22:13.955735 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-snmp-webhook-78bcbbdcff-52mm9" event={"ID":"213ffefc-86ab-4944-a9b5-888a99455d05","Type":"ContainerStarted","Data":"88d5b309f273fe770a7cf92acd509ea0b7c52a81b64ba905cbafa63b47ea76df"} Feb 23 00:22:13 crc kubenswrapper[4735]: I0223 00:22:13.962930 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qxcn4" podUID="59bda72c-8385-4452-9328-a56b1dab7114" containerName="registry-server" containerID="cri-o://6aee6a28e0aeeafac4e04ced615e53bd3c3b2775b386e2b811e18283fcf9f586" gracePeriod=2 Feb 23 00:22:13 crc kubenswrapper[4735]: I0223 00:22:13.963037 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"1ad20943-871d-4331-a0e1-5145da93efee","Type":"ContainerStarted","Data":"08d1a917e58d61278952a554b36889a08b237f0bf81509114fa69a72c41f6302"} Feb 23 00:22:13 crc kubenswrapper[4735]: I0223 00:22:13.988011 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-snmp-webhook-78bcbbdcff-52mm9" podStartSLOduration=2.175438971 podStartE2EDuration="10.987993302s" podCreationTimestamp="2026-02-23 00:22:03 +0000 UTC" firstStartedPulling="2026-02-23 00:22:04.792181476 +0000 
UTC m=+883.255727457" lastFinishedPulling="2026-02-23 00:22:13.604735817 +0000 UTC m=+892.068281788" observedRunningTime="2026-02-23 00:22:13.985067571 +0000 UTC m=+892.448613542" watchObservedRunningTime="2026-02-23 00:22:13.987993302 +0000 UTC m=+892.451539283" Feb 23 00:22:14 crc kubenswrapper[4735]: I0223 00:22:14.425587 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qxcn4" Feb 23 00:22:14 crc kubenswrapper[4735]: I0223 00:22:14.466637 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztpff\" (UniqueName: \"kubernetes.io/projected/59bda72c-8385-4452-9328-a56b1dab7114-kube-api-access-ztpff\") pod \"59bda72c-8385-4452-9328-a56b1dab7114\" (UID: \"59bda72c-8385-4452-9328-a56b1dab7114\") " Feb 23 00:22:14 crc kubenswrapper[4735]: I0223 00:22:14.466680 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59bda72c-8385-4452-9328-a56b1dab7114-utilities\") pod \"59bda72c-8385-4452-9328-a56b1dab7114\" (UID: \"59bda72c-8385-4452-9328-a56b1dab7114\") " Feb 23 00:22:14 crc kubenswrapper[4735]: I0223 00:22:14.466698 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59bda72c-8385-4452-9328-a56b1dab7114-catalog-content\") pod \"59bda72c-8385-4452-9328-a56b1dab7114\" (UID: \"59bda72c-8385-4452-9328-a56b1dab7114\") " Feb 23 00:22:14 crc kubenswrapper[4735]: I0223 00:22:14.477226 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59bda72c-8385-4452-9328-a56b1dab7114-utilities" (OuterVolumeSpecName: "utilities") pod "59bda72c-8385-4452-9328-a56b1dab7114" (UID: "59bda72c-8385-4452-9328-a56b1dab7114"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 00:22:14 crc kubenswrapper[4735]: I0223 00:22:14.483345 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59bda72c-8385-4452-9328-a56b1dab7114-kube-api-access-ztpff" (OuterVolumeSpecName: "kube-api-access-ztpff") pod "59bda72c-8385-4452-9328-a56b1dab7114" (UID: "59bda72c-8385-4452-9328-a56b1dab7114"). InnerVolumeSpecName "kube-api-access-ztpff". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:22:14 crc kubenswrapper[4735]: I0223 00:22:14.521165 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59bda72c-8385-4452-9328-a56b1dab7114-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "59bda72c-8385-4452-9328-a56b1dab7114" (UID: "59bda72c-8385-4452-9328-a56b1dab7114"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 00:22:14 crc kubenswrapper[4735]: I0223 00:22:14.567566 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztpff\" (UniqueName: \"kubernetes.io/projected/59bda72c-8385-4452-9328-a56b1dab7114-kube-api-access-ztpff\") on node \"crc\" DevicePath \"\"" Feb 23 00:22:14 crc kubenswrapper[4735]: I0223 00:22:14.567604 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59bda72c-8385-4452-9328-a56b1dab7114-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 00:22:14 crc kubenswrapper[4735]: I0223 00:22:14.567617 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59bda72c-8385-4452-9328-a56b1dab7114-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 00:22:14 crc kubenswrapper[4735]: I0223 00:22:14.971294 4735 generic.go:334] "Generic (PLEG): container finished" podID="59bda72c-8385-4452-9328-a56b1dab7114" 
containerID="6aee6a28e0aeeafac4e04ced615e53bd3c3b2775b386e2b811e18283fcf9f586" exitCode=0 Feb 23 00:22:14 crc kubenswrapper[4735]: I0223 00:22:14.971354 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qxcn4" event={"ID":"59bda72c-8385-4452-9328-a56b1dab7114","Type":"ContainerDied","Data":"6aee6a28e0aeeafac4e04ced615e53bd3c3b2775b386e2b811e18283fcf9f586"} Feb 23 00:22:14 crc kubenswrapper[4735]: I0223 00:22:14.971744 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qxcn4" event={"ID":"59bda72c-8385-4452-9328-a56b1dab7114","Type":"ContainerDied","Data":"dc080369d9c3323de22f954883814aa2e794e61bec0b6285ba63f8a3153121cd"} Feb 23 00:22:14 crc kubenswrapper[4735]: I0223 00:22:14.971402 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qxcn4" Feb 23 00:22:14 crc kubenswrapper[4735]: I0223 00:22:14.971765 4735 scope.go:117] "RemoveContainer" containerID="6aee6a28e0aeeafac4e04ced615e53bd3c3b2775b386e2b811e18283fcf9f586" Feb 23 00:22:15 crc kubenswrapper[4735]: I0223 00:22:15.003265 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qxcn4"] Feb 23 00:22:15 crc kubenswrapper[4735]: I0223 00:22:15.007953 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qxcn4"] Feb 23 00:22:15 crc kubenswrapper[4735]: I0223 00:22:15.979754 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"1ad20943-871d-4331-a0e1-5145da93efee","Type":"ContainerStarted","Data":"a1e3beede127d3cdb42df0abe9c15f617e6135113b82cf404b29d1a5409153c8"} Feb 23 00:22:16 crc kubenswrapper[4735]: I0223 00:22:16.280353 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59bda72c-8385-4452-9328-a56b1dab7114" 
path="/var/lib/kubelet/pods/59bda72c-8385-4452-9328-a56b1dab7114/volumes" Feb 23 00:22:16 crc kubenswrapper[4735]: I0223 00:22:16.943412 4735 scope.go:117] "RemoveContainer" containerID="77c8085ce9267ce497e861a970738a5f38ffc7d7372048d4c8ded814a4907b7f" Feb 23 00:22:17 crc kubenswrapper[4735]: I0223 00:22:17.263241 4735 scope.go:117] "RemoveContainer" containerID="2321d7b190996a1df010f0e73049d3b87f9da30344f1f3f6c90b9fcf810bc9a0" Feb 23 00:22:17 crc kubenswrapper[4735]: I0223 00:22:17.395142 4735 scope.go:117] "RemoveContainer" containerID="6aee6a28e0aeeafac4e04ced615e53bd3c3b2775b386e2b811e18283fcf9f586" Feb 23 00:22:17 crc kubenswrapper[4735]: E0223 00:22:17.395484 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6aee6a28e0aeeafac4e04ced615e53bd3c3b2775b386e2b811e18283fcf9f586\": container with ID starting with 6aee6a28e0aeeafac4e04ced615e53bd3c3b2775b386e2b811e18283fcf9f586 not found: ID does not exist" containerID="6aee6a28e0aeeafac4e04ced615e53bd3c3b2775b386e2b811e18283fcf9f586" Feb 23 00:22:17 crc kubenswrapper[4735]: I0223 00:22:17.395512 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6aee6a28e0aeeafac4e04ced615e53bd3c3b2775b386e2b811e18283fcf9f586"} err="failed to get container status \"6aee6a28e0aeeafac4e04ced615e53bd3c3b2775b386e2b811e18283fcf9f586\": rpc error: code = NotFound desc = could not find container \"6aee6a28e0aeeafac4e04ced615e53bd3c3b2775b386e2b811e18283fcf9f586\": container with ID starting with 6aee6a28e0aeeafac4e04ced615e53bd3c3b2775b386e2b811e18283fcf9f586 not found: ID does not exist" Feb 23 00:22:17 crc kubenswrapper[4735]: I0223 00:22:17.395530 4735 scope.go:117] "RemoveContainer" containerID="77c8085ce9267ce497e861a970738a5f38ffc7d7372048d4c8ded814a4907b7f" Feb 23 00:22:17 crc kubenswrapper[4735]: E0223 00:22:17.395896 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"77c8085ce9267ce497e861a970738a5f38ffc7d7372048d4c8ded814a4907b7f\": container with ID starting with 77c8085ce9267ce497e861a970738a5f38ffc7d7372048d4c8ded814a4907b7f not found: ID does not exist" containerID="77c8085ce9267ce497e861a970738a5f38ffc7d7372048d4c8ded814a4907b7f" Feb 23 00:22:17 crc kubenswrapper[4735]: I0223 00:22:17.395919 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77c8085ce9267ce497e861a970738a5f38ffc7d7372048d4c8ded814a4907b7f"} err="failed to get container status \"77c8085ce9267ce497e861a970738a5f38ffc7d7372048d4c8ded814a4907b7f\": rpc error: code = NotFound desc = could not find container \"77c8085ce9267ce497e861a970738a5f38ffc7d7372048d4c8ded814a4907b7f\": container with ID starting with 77c8085ce9267ce497e861a970738a5f38ffc7d7372048d4c8ded814a4907b7f not found: ID does not exist" Feb 23 00:22:17 crc kubenswrapper[4735]: I0223 00:22:17.395931 4735 scope.go:117] "RemoveContainer" containerID="2321d7b190996a1df010f0e73049d3b87f9da30344f1f3f6c90b9fcf810bc9a0" Feb 23 00:22:17 crc kubenswrapper[4735]: E0223 00:22:17.396114 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2321d7b190996a1df010f0e73049d3b87f9da30344f1f3f6c90b9fcf810bc9a0\": container with ID starting with 2321d7b190996a1df010f0e73049d3b87f9da30344f1f3f6c90b9fcf810bc9a0 not found: ID does not exist" containerID="2321d7b190996a1df010f0e73049d3b87f9da30344f1f3f6c90b9fcf810bc9a0" Feb 23 00:22:17 crc kubenswrapper[4735]: I0223 00:22:17.396133 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2321d7b190996a1df010f0e73049d3b87f9da30344f1f3f6c90b9fcf810bc9a0"} err="failed to get container status \"2321d7b190996a1df010f0e73049d3b87f9da30344f1f3f6c90b9fcf810bc9a0\": rpc error: code = NotFound desc = could not find container 
\"2321d7b190996a1df010f0e73049d3b87f9da30344f1f3f6c90b9fcf810bc9a0\": container with ID starting with 2321d7b190996a1df010f0e73049d3b87f9da30344f1f3f6c90b9fcf810bc9a0 not found: ID does not exist" Feb 23 00:22:17 crc kubenswrapper[4735]: I0223 00:22:17.998336 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"f1292e6a-8c09-4732-8766-c0ebcefbde0a","Type":"ContainerStarted","Data":"1faf125676e3d200eddfdc8d851a8d43320d7161e2d23a508cdc9488e58a195b"} Feb 23 00:22:20 crc kubenswrapper[4735]: I0223 00:22:20.738571 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq"] Feb 23 00:22:20 crc kubenswrapper[4735]: E0223 00:22:20.739137 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59bda72c-8385-4452-9328-a56b1dab7114" containerName="registry-server" Feb 23 00:22:20 crc kubenswrapper[4735]: I0223 00:22:20.739154 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="59bda72c-8385-4452-9328-a56b1dab7114" containerName="registry-server" Feb 23 00:22:20 crc kubenswrapper[4735]: E0223 00:22:20.739171 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59bda72c-8385-4452-9328-a56b1dab7114" containerName="extract-utilities" Feb 23 00:22:20 crc kubenswrapper[4735]: I0223 00:22:20.739179 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="59bda72c-8385-4452-9328-a56b1dab7114" containerName="extract-utilities" Feb 23 00:22:20 crc kubenswrapper[4735]: E0223 00:22:20.739195 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59bda72c-8385-4452-9328-a56b1dab7114" containerName="extract-content" Feb 23 00:22:20 crc kubenswrapper[4735]: I0223 00:22:20.739204 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="59bda72c-8385-4452-9328-a56b1dab7114" containerName="extract-content" Feb 23 00:22:20 crc kubenswrapper[4735]: I0223 00:22:20.739346 4735 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="59bda72c-8385-4452-9328-a56b1dab7114" containerName="registry-server" Feb 23 00:22:20 crc kubenswrapper[4735]: I0223 00:22:20.740421 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq" Feb 23 00:22:20 crc kubenswrapper[4735]: I0223 00:22:20.742687 4735 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-dockercfg-8tjmk" Feb 23 00:22:20 crc kubenswrapper[4735]: I0223 00:22:20.743203 4735 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-coll-meter-proxy-tls" Feb 23 00:22:20 crc kubenswrapper[4735]: I0223 00:22:20.743448 4735 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-session-secret" Feb 23 00:22:20 crc kubenswrapper[4735]: I0223 00:22:20.743726 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-coll-meter-sg-core-configmap" Feb 23 00:22:20 crc kubenswrapper[4735]: I0223 00:22:20.755428 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq"] Feb 23 00:22:20 crc kubenswrapper[4735]: I0223 00:22:20.851331 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vb4h\" (UniqueName: \"kubernetes.io/projected/76aa4cee-ada5-4bc9-854f-127390e15fd2-kube-api-access-9vb4h\") pod \"default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq\" (UID: \"76aa4cee-ada5-4bc9-854f-127390e15fd2\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq" Feb 23 00:22:20 crc kubenswrapper[4735]: I0223 00:22:20.851395 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: 
\"kubernetes.io/configmap/76aa4cee-ada5-4bc9-854f-127390e15fd2-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq\" (UID: \"76aa4cee-ada5-4bc9-854f-127390e15fd2\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq" Feb 23 00:22:20 crc kubenswrapper[4735]: I0223 00:22:20.851419 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/76aa4cee-ada5-4bc9-854f-127390e15fd2-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq\" (UID: \"76aa4cee-ada5-4bc9-854f-127390e15fd2\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq" Feb 23 00:22:20 crc kubenswrapper[4735]: I0223 00:22:20.851445 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/76aa4cee-ada5-4bc9-854f-127390e15fd2-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq\" (UID: \"76aa4cee-ada5-4bc9-854f-127390e15fd2\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq" Feb 23 00:22:20 crc kubenswrapper[4735]: I0223 00:22:20.851468 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/76aa4cee-ada5-4bc9-854f-127390e15fd2-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq\" (UID: \"76aa4cee-ada5-4bc9-854f-127390e15fd2\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq" Feb 23 00:22:20 crc kubenswrapper[4735]: I0223 00:22:20.953631 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/76aa4cee-ada5-4bc9-854f-127390e15fd2-sg-core-config\") pod 
\"default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq\" (UID: \"76aa4cee-ada5-4bc9-854f-127390e15fd2\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq" Feb 23 00:22:20 crc kubenswrapper[4735]: I0223 00:22:20.953769 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/76aa4cee-ada5-4bc9-854f-127390e15fd2-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq\" (UID: \"76aa4cee-ada5-4bc9-854f-127390e15fd2\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq" Feb 23 00:22:20 crc kubenswrapper[4735]: I0223 00:22:20.953856 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/76aa4cee-ada5-4bc9-854f-127390e15fd2-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq\" (UID: \"76aa4cee-ada5-4bc9-854f-127390e15fd2\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq" Feb 23 00:22:20 crc kubenswrapper[4735]: I0223 00:22:20.953903 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/76aa4cee-ada5-4bc9-854f-127390e15fd2-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq\" (UID: \"76aa4cee-ada5-4bc9-854f-127390e15fd2\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq" Feb 23 00:22:20 crc kubenswrapper[4735]: E0223 00:22:20.953962 4735 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-coll-meter-proxy-tls: secret "default-cloud1-coll-meter-proxy-tls" not found Feb 23 00:22:20 crc kubenswrapper[4735]: I0223 00:22:20.953977 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vb4h\" (UniqueName: 
\"kubernetes.io/projected/76aa4cee-ada5-4bc9-854f-127390e15fd2-kube-api-access-9vb4h\") pod \"default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq\" (UID: \"76aa4cee-ada5-4bc9-854f-127390e15fd2\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq" Feb 23 00:22:20 crc kubenswrapper[4735]: E0223 00:22:20.954033 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76aa4cee-ada5-4bc9-854f-127390e15fd2-default-cloud1-coll-meter-proxy-tls podName:76aa4cee-ada5-4bc9-854f-127390e15fd2 nodeName:}" failed. No retries permitted until 2026-02-23 00:22:21.454009839 +0000 UTC m=+899.917555810 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-coll-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/76aa4cee-ada5-4bc9-854f-127390e15fd2-default-cloud1-coll-meter-proxy-tls") pod "default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq" (UID: "76aa4cee-ada5-4bc9-854f-127390e15fd2") : secret "default-cloud1-coll-meter-proxy-tls" not found Feb 23 00:22:20 crc kubenswrapper[4735]: I0223 00:22:20.954824 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/76aa4cee-ada5-4bc9-854f-127390e15fd2-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq\" (UID: \"76aa4cee-ada5-4bc9-854f-127390e15fd2\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq" Feb 23 00:22:20 crc kubenswrapper[4735]: I0223 00:22:20.956479 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/76aa4cee-ada5-4bc9-854f-127390e15fd2-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq\" (UID: \"76aa4cee-ada5-4bc9-854f-127390e15fd2\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq" Feb 23 00:22:20 crc kubenswrapper[4735]: I0223 00:22:20.961784 4735 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/76aa4cee-ada5-4bc9-854f-127390e15fd2-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq\" (UID: \"76aa4cee-ada5-4bc9-854f-127390e15fd2\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq"
Feb 23 00:22:20 crc kubenswrapper[4735]: I0223 00:22:20.978449 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vb4h\" (UniqueName: \"kubernetes.io/projected/76aa4cee-ada5-4bc9-854f-127390e15fd2-kube-api-access-9vb4h\") pod \"default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq\" (UID: \"76aa4cee-ada5-4bc9-854f-127390e15fd2\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq"
Feb 23 00:22:21 crc kubenswrapper[4735]: I0223 00:22:21.024451 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"f1292e6a-8c09-4732-8766-c0ebcefbde0a","Type":"ContainerStarted","Data":"6bf6ed12ac7aa07379681cdbdecce8fe7764fe4520d8578c98be85057b40c87e"}
Feb 23 00:22:21 crc kubenswrapper[4735]: I0223 00:22:21.460725 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/76aa4cee-ada5-4bc9-854f-127390e15fd2-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq\" (UID: \"76aa4cee-ada5-4bc9-854f-127390e15fd2\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq"
Feb 23 00:22:21 crc kubenswrapper[4735]: E0223 00:22:21.461034 4735 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-coll-meter-proxy-tls: secret "default-cloud1-coll-meter-proxy-tls" not found
Feb 23 00:22:21 crc kubenswrapper[4735]: E0223 00:22:21.461131 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76aa4cee-ada5-4bc9-854f-127390e15fd2-default-cloud1-coll-meter-proxy-tls podName:76aa4cee-ada5-4bc9-854f-127390e15fd2 nodeName:}" failed. No retries permitted until 2026-02-23 00:22:22.461106685 +0000 UTC m=+900.924652686 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "default-cloud1-coll-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/76aa4cee-ada5-4bc9-854f-127390e15fd2-default-cloud1-coll-meter-proxy-tls") pod "default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq" (UID: "76aa4cee-ada5-4bc9-854f-127390e15fd2") : secret "default-cloud1-coll-meter-proxy-tls" not found
Feb 23 00:22:22 crc kubenswrapper[4735]: I0223 00:22:22.485049 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/76aa4cee-ada5-4bc9-854f-127390e15fd2-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq\" (UID: \"76aa4cee-ada5-4bc9-854f-127390e15fd2\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq"
Feb 23 00:22:22 crc kubenswrapper[4735]: I0223 00:22:22.490945 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/76aa4cee-ada5-4bc9-854f-127390e15fd2-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq\" (UID: \"76aa4cee-ada5-4bc9-854f-127390e15fd2\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq"
Feb 23 00:22:22 crc kubenswrapper[4735]: I0223 00:22:22.716096 4735 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-dockercfg-8tjmk"
Feb 23 00:22:22 crc kubenswrapper[4735]: I0223 00:22:22.723517 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq"
Feb 23 00:22:22 crc kubenswrapper[4735]: I0223 00:22:22.810932 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht"]
Feb 23 00:22:22 crc kubenswrapper[4735]: I0223 00:22:22.812338 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht"
Feb 23 00:22:22 crc kubenswrapper[4735]: I0223 00:22:22.814656 4735 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-ceil-meter-proxy-tls"
Feb 23 00:22:22 crc kubenswrapper[4735]: I0223 00:22:22.814755 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-ceil-meter-sg-core-configmap"
Feb 23 00:22:22 crc kubenswrapper[4735]: I0223 00:22:22.821095 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht"]
Feb 23 00:22:22 crc kubenswrapper[4735]: I0223 00:22:22.899798 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/475fb5a3-ada6-41f2-baff-da287ca512f3-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht\" (UID: \"475fb5a3-ada6-41f2-baff-da287ca512f3\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht"
Feb 23 00:22:22 crc kubenswrapper[4735]: I0223 00:22:22.899874 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2qjp\" (UniqueName: \"kubernetes.io/projected/475fb5a3-ada6-41f2-baff-da287ca512f3-kube-api-access-l2qjp\") pod \"default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht\" (UID: \"475fb5a3-ada6-41f2-baff-da287ca512f3\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht"
Feb 23 00:22:22 crc kubenswrapper[4735]: I0223 00:22:22.900067 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/475fb5a3-ada6-41f2-baff-da287ca512f3-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht\" (UID: \"475fb5a3-ada6-41f2-baff-da287ca512f3\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht"
Feb 23 00:22:22 crc kubenswrapper[4735]: I0223 00:22:22.900141 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/475fb5a3-ada6-41f2-baff-da287ca512f3-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht\" (UID: \"475fb5a3-ada6-41f2-baff-da287ca512f3\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht"
Feb 23 00:22:22 crc kubenswrapper[4735]: I0223 00:22:22.900287 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/475fb5a3-ada6-41f2-baff-da287ca512f3-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht\" (UID: \"475fb5a3-ada6-41f2-baff-da287ca512f3\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht"
Feb 23 00:22:23 crc kubenswrapper[4735]: I0223 00:22:23.001454 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2qjp\" (UniqueName: \"kubernetes.io/projected/475fb5a3-ada6-41f2-baff-da287ca512f3-kube-api-access-l2qjp\") pod \"default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht\" (UID: \"475fb5a3-ada6-41f2-baff-da287ca512f3\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht"
Feb 23 00:22:23 crc kubenswrapper[4735]: I0223 00:22:23.018052 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/475fb5a3-ada6-41f2-baff-da287ca512f3-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht\" (UID: \"475fb5a3-ada6-41f2-baff-da287ca512f3\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht"
Feb 23 00:22:23 crc kubenswrapper[4735]: I0223 00:22:23.018439 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/475fb5a3-ada6-41f2-baff-da287ca512f3-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht\" (UID: \"475fb5a3-ada6-41f2-baff-da287ca512f3\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht"
Feb 23 00:22:23 crc kubenswrapper[4735]: I0223 00:22:23.018532 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/475fb5a3-ada6-41f2-baff-da287ca512f3-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht\" (UID: \"475fb5a3-ada6-41f2-baff-da287ca512f3\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht"
Feb 23 00:22:23 crc kubenswrapper[4735]: I0223 00:22:23.018642 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/475fb5a3-ada6-41f2-baff-da287ca512f3-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht\" (UID: \"475fb5a3-ada6-41f2-baff-da287ca512f3\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht"
Feb 23 00:22:23 crc kubenswrapper[4735]: E0223 00:22:23.019090 4735 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-ceil-meter-proxy-tls: secret "default-cloud1-ceil-meter-proxy-tls" not found
Feb 23 00:22:23 crc kubenswrapper[4735]: E0223 00:22:23.019172 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/475fb5a3-ada6-41f2-baff-da287ca512f3-default-cloud1-ceil-meter-proxy-tls podName:475fb5a3-ada6-41f2-baff-da287ca512f3 nodeName:}" failed. No retries permitted until 2026-02-23 00:22:23.519154379 +0000 UTC m=+901.982700350 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-ceil-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/475fb5a3-ada6-41f2-baff-da287ca512f3-default-cloud1-ceil-meter-proxy-tls") pod "default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht" (UID: "475fb5a3-ada6-41f2-baff-da287ca512f3") : secret "default-cloud1-ceil-meter-proxy-tls" not found
Feb 23 00:22:23 crc kubenswrapper[4735]: I0223 00:22:23.023339 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/475fb5a3-ada6-41f2-baff-da287ca512f3-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht\" (UID: \"475fb5a3-ada6-41f2-baff-da287ca512f3\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht"
Feb 23 00:22:23 crc kubenswrapper[4735]: I0223 00:22:23.024014 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/475fb5a3-ada6-41f2-baff-da287ca512f3-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht\" (UID: \"475fb5a3-ada6-41f2-baff-da287ca512f3\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht"
Feb 23 00:22:23 crc kubenswrapper[4735]: I0223 00:22:23.039651 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/475fb5a3-ada6-41f2-baff-da287ca512f3-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht\" (UID: \"475fb5a3-ada6-41f2-baff-da287ca512f3\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht"
Feb 23 00:22:23 crc kubenswrapper[4735]: I0223 00:22:23.041065 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2qjp\" (UniqueName: \"kubernetes.io/projected/475fb5a3-ada6-41f2-baff-da287ca512f3-kube-api-access-l2qjp\") pod \"default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht\" (UID: \"475fb5a3-ada6-41f2-baff-da287ca512f3\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht"
Feb 23 00:22:23 crc kubenswrapper[4735]: I0223 00:22:23.046110 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq"]
Feb 23 00:22:23 crc kubenswrapper[4735]: I0223 00:22:23.525461 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/475fb5a3-ada6-41f2-baff-da287ca512f3-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht\" (UID: \"475fb5a3-ada6-41f2-baff-da287ca512f3\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht"
Feb 23 00:22:23 crc kubenswrapper[4735]: E0223 00:22:23.525808 4735 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-ceil-meter-proxy-tls: secret "default-cloud1-ceil-meter-proxy-tls" not found
Feb 23 00:22:23 crc kubenswrapper[4735]: E0223 00:22:23.527606 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/475fb5a3-ada6-41f2-baff-da287ca512f3-default-cloud1-ceil-meter-proxy-tls podName:475fb5a3-ada6-41f2-baff-da287ca512f3 nodeName:}" failed. No retries permitted until 2026-02-23 00:22:24.527564856 +0000 UTC m=+902.991110857 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "default-cloud1-ceil-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/475fb5a3-ada6-41f2-baff-da287ca512f3-default-cloud1-ceil-meter-proxy-tls") pod "default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht" (UID: "475fb5a3-ada6-41f2-baff-da287ca512f3") : secret "default-cloud1-ceil-meter-proxy-tls" not found
Feb 23 00:22:24 crc kubenswrapper[4735]: I0223 00:22:24.058780 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq" event={"ID":"76aa4cee-ada5-4bc9-854f-127390e15fd2","Type":"ContainerStarted","Data":"ff911751b60eb76eb0f18d3471dcbda0c4f19f7305a15287a70c430c6d82b8e2"}
Feb 23 00:22:24 crc kubenswrapper[4735]: I0223 00:22:24.542936 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/475fb5a3-ada6-41f2-baff-da287ca512f3-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht\" (UID: \"475fb5a3-ada6-41f2-baff-da287ca512f3\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht"
Feb 23 00:22:24 crc kubenswrapper[4735]: I0223 00:22:24.549285 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/475fb5a3-ada6-41f2-baff-da287ca512f3-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht\" (UID: \"475fb5a3-ada6-41f2-baff-da287ca512f3\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht"
Feb 23 00:22:24 crc kubenswrapper[4735]: I0223 00:22:24.630706 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht"
Feb 23 00:22:25 crc kubenswrapper[4735]: I0223 00:22:25.088570 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht"]
Feb 23 00:22:25 crc kubenswrapper[4735]: W0223 00:22:25.094031 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod475fb5a3_ada6_41f2_baff_da287ca512f3.slice/crio-c9465d17b4a325f2434da25aa32da339e6e495d7f00ee49fe8da0258d03c9537 WatchSource:0}: Error finding container c9465d17b4a325f2434da25aa32da339e6e495d7f00ee49fe8da0258d03c9537: Status 404 returned error can't find the container with id c9465d17b4a325f2434da25aa32da339e6e495d7f00ee49fe8da0258d03c9537
Feb 23 00:22:26 crc kubenswrapper[4735]: I0223 00:22:26.071381 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht" event={"ID":"475fb5a3-ada6-41f2-baff-da287ca512f3","Type":"ContainerStarted","Data":"c9465d17b4a325f2434da25aa32da339e6e495d7f00ee49fe8da0258d03c9537"}
Feb 23 00:22:26 crc kubenswrapper[4735]: I0223 00:22:26.072780 4735 generic.go:334] "Generic (PLEG): container finished" podID="1ad20943-871d-4331-a0e1-5145da93efee" containerID="a1e3beede127d3cdb42df0abe9c15f617e6135113b82cf404b29d1a5409153c8" exitCode=0
Feb 23 00:22:26 crc kubenswrapper[4735]: I0223 00:22:26.072816 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"1ad20943-871d-4331-a0e1-5145da93efee","Type":"ContainerDied","Data":"a1e3beede127d3cdb42df0abe9c15f617e6135113b82cf404b29d1a5409153c8"}
Feb 23 00:22:26 crc kubenswrapper[4735]: I0223 00:22:26.702138 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7"]
Feb 23 00:22:26 crc kubenswrapper[4735]: I0223 00:22:26.703514 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7"
Feb 23 00:22:26 crc kubenswrapper[4735]: I0223 00:22:26.709294 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-sens-meter-sg-core-configmap"
Feb 23 00:22:26 crc kubenswrapper[4735]: I0223 00:22:26.709488 4735 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-sens-meter-proxy-tls"
Feb 23 00:22:26 crc kubenswrapper[4735]: I0223 00:22:26.713822 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7"]
Feb 23 00:22:26 crc kubenswrapper[4735]: I0223 00:22:26.774379 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/6e748a5f-2b4d-4815-9b8f-0429055eae26-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7\" (UID: \"6e748a5f-2b4d-4815-9b8f-0429055eae26\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7"
Feb 23 00:22:26 crc kubenswrapper[4735]: I0223 00:22:26.774440 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/6e748a5f-2b4d-4815-9b8f-0429055eae26-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7\" (UID: \"6e748a5f-2b4d-4815-9b8f-0429055eae26\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7"
Feb 23 00:22:26 crc kubenswrapper[4735]: I0223 00:22:26.774484 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e748a5f-2b4d-4815-9b8f-0429055eae26-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7\" (UID: \"6e748a5f-2b4d-4815-9b8f-0429055eae26\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7"
Feb 23 00:22:26 crc kubenswrapper[4735]: I0223 00:22:26.774554 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xnn8\" (UniqueName: \"kubernetes.io/projected/6e748a5f-2b4d-4815-9b8f-0429055eae26-kube-api-access-4xnn8\") pod \"default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7\" (UID: \"6e748a5f-2b4d-4815-9b8f-0429055eae26\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7"
Feb 23 00:22:26 crc kubenswrapper[4735]: I0223 00:22:26.774585 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/6e748a5f-2b4d-4815-9b8f-0429055eae26-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7\" (UID: \"6e748a5f-2b4d-4815-9b8f-0429055eae26\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7"
Feb 23 00:22:26 crc kubenswrapper[4735]: I0223 00:22:26.876032 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xnn8\" (UniqueName: \"kubernetes.io/projected/6e748a5f-2b4d-4815-9b8f-0429055eae26-kube-api-access-4xnn8\") pod \"default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7\" (UID: \"6e748a5f-2b4d-4815-9b8f-0429055eae26\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7"
Feb 23 00:22:26 crc kubenswrapper[4735]: I0223 00:22:26.876079 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/6e748a5f-2b4d-4815-9b8f-0429055eae26-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7\" (UID: \"6e748a5f-2b4d-4815-9b8f-0429055eae26\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7"
Feb 23 00:22:26 crc kubenswrapper[4735]: I0223 00:22:26.876125 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/6e748a5f-2b4d-4815-9b8f-0429055eae26-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7\" (UID: \"6e748a5f-2b4d-4815-9b8f-0429055eae26\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7"
Feb 23 00:22:26 crc kubenswrapper[4735]: I0223 00:22:26.876162 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/6e748a5f-2b4d-4815-9b8f-0429055eae26-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7\" (UID: \"6e748a5f-2b4d-4815-9b8f-0429055eae26\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7"
Feb 23 00:22:26 crc kubenswrapper[4735]: I0223 00:22:26.876188 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e748a5f-2b4d-4815-9b8f-0429055eae26-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7\" (UID: \"6e748a5f-2b4d-4815-9b8f-0429055eae26\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7"
Feb 23 00:22:26 crc kubenswrapper[4735]: E0223 00:22:26.876273 4735 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-sens-meter-proxy-tls: secret "default-cloud1-sens-meter-proxy-tls" not found
Feb 23 00:22:26 crc kubenswrapper[4735]: E0223 00:22:26.876313 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e748a5f-2b4d-4815-9b8f-0429055eae26-default-cloud1-sens-meter-proxy-tls podName:6e748a5f-2b4d-4815-9b8f-0429055eae26 nodeName:}" failed. No retries permitted until 2026-02-23 00:22:27.376299031 +0000 UTC m=+905.839845002 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-sens-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/6e748a5f-2b4d-4815-9b8f-0429055eae26-default-cloud1-sens-meter-proxy-tls") pod "default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7" (UID: "6e748a5f-2b4d-4815-9b8f-0429055eae26") : secret "default-cloud1-sens-meter-proxy-tls" not found
Feb 23 00:22:26 crc kubenswrapper[4735]: I0223 00:22:26.876573 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/6e748a5f-2b4d-4815-9b8f-0429055eae26-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7\" (UID: \"6e748a5f-2b4d-4815-9b8f-0429055eae26\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7"
Feb 23 00:22:26 crc kubenswrapper[4735]: I0223 00:22:26.876996 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/6e748a5f-2b4d-4815-9b8f-0429055eae26-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7\" (UID: \"6e748a5f-2b4d-4815-9b8f-0429055eae26\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7"
Feb 23 00:22:26 crc kubenswrapper[4735]: I0223 00:22:26.881999 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/6e748a5f-2b4d-4815-9b8f-0429055eae26-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7\" (UID: \"6e748a5f-2b4d-4815-9b8f-0429055eae26\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7"
Feb 23 00:22:26 crc kubenswrapper[4735]: I0223 00:22:26.893651 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xnn8\" (UniqueName: \"kubernetes.io/projected/6e748a5f-2b4d-4815-9b8f-0429055eae26-kube-api-access-4xnn8\") pod \"default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7\" (UID: \"6e748a5f-2b4d-4815-9b8f-0429055eae26\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7"
Feb 23 00:22:27 crc kubenswrapper[4735]: I0223 00:22:27.383773 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e748a5f-2b4d-4815-9b8f-0429055eae26-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7\" (UID: \"6e748a5f-2b4d-4815-9b8f-0429055eae26\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7"
Feb 23 00:22:27 crc kubenswrapper[4735]: E0223 00:22:27.383973 4735 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-sens-meter-proxy-tls: secret "default-cloud1-sens-meter-proxy-tls" not found
Feb 23 00:22:27 crc kubenswrapper[4735]: E0223 00:22:27.384029 4735 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e748a5f-2b4d-4815-9b8f-0429055eae26-default-cloud1-sens-meter-proxy-tls podName:6e748a5f-2b4d-4815-9b8f-0429055eae26 nodeName:}" failed. No retries permitted until 2026-02-23 00:22:28.384010761 +0000 UTC m=+906.847556732 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "default-cloud1-sens-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/6e748a5f-2b4d-4815-9b8f-0429055eae26-default-cloud1-sens-meter-proxy-tls") pod "default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7" (UID: "6e748a5f-2b4d-4815-9b8f-0429055eae26") : secret "default-cloud1-sens-meter-proxy-tls" not found
Feb 23 00:22:28 crc kubenswrapper[4735]: I0223 00:22:28.398051 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e748a5f-2b4d-4815-9b8f-0429055eae26-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7\" (UID: \"6e748a5f-2b4d-4815-9b8f-0429055eae26\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7"
Feb 23 00:22:28 crc kubenswrapper[4735]: I0223 00:22:28.405332 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e748a5f-2b4d-4815-9b8f-0429055eae26-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7\" (UID: \"6e748a5f-2b4d-4815-9b8f-0429055eae26\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7"
Feb 23 00:22:28 crc kubenswrapper[4735]: I0223 00:22:28.520074 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7"
Feb 23 00:22:30 crc kubenswrapper[4735]: I0223 00:22:30.955302 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7"]
Feb 23 00:22:31 crc kubenswrapper[4735]: I0223 00:22:31.111278 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht" event={"ID":"475fb5a3-ada6-41f2-baff-da287ca512f3","Type":"ContainerStarted","Data":"a6d9958b62bb564c542a9e27cc46a19c4738c94628a770e3cdf8fa88bf70766b"}
Feb 23 00:22:31 crc kubenswrapper[4735]: I0223 00:22:31.114423 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"f1292e6a-8c09-4732-8766-c0ebcefbde0a","Type":"ContainerStarted","Data":"ba94d268d1cdeb80b3f7f26a435e2fd3401cbff78e02f5c7b13896ded14263c2"}
Feb 23 00:22:31 crc kubenswrapper[4735]: I0223 00:22:31.155621 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-default-0" podStartSLOduration=3.960480543 podStartE2EDuration="38.155607168s" podCreationTimestamp="2026-02-23 00:21:53 +0000 UTC" firstStartedPulling="2026-02-23 00:21:56.446904787 +0000 UTC m=+874.910450758" lastFinishedPulling="2026-02-23 00:22:30.642031412 +0000 UTC m=+909.105577383" observedRunningTime="2026-02-23 00:22:31.155381532 +0000 UTC m=+909.618927503" watchObservedRunningTime="2026-02-23 00:22:31.155607168 +0000 UTC m=+909.619153139"
Feb 23 00:22:31 crc kubenswrapper[4735]: W0223 00:22:31.516544 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e748a5f_2b4d_4815_9b8f_0429055eae26.slice/crio-a5f78bda9313a49632900d6592909a42d46e66383136fb9cafaaf10c2102da07 WatchSource:0}: Error finding container a5f78bda9313a49632900d6592909a42d46e66383136fb9cafaaf10c2102da07: Status 404 returned error can't find the container with id a5f78bda9313a49632900d6592909a42d46e66383136fb9cafaaf10c2102da07
Feb 23 00:22:32 crc kubenswrapper[4735]: I0223 00:22:32.127081 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7" event={"ID":"6e748a5f-2b4d-4815-9b8f-0429055eae26","Type":"ContainerStarted","Data":"a5f78bda9313a49632900d6592909a42d46e66383136fb9cafaaf10c2102da07"}
Feb 23 00:22:32 crc kubenswrapper[4735]: I0223 00:22:32.129619 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq" event={"ID":"76aa4cee-ada5-4bc9-854f-127390e15fd2","Type":"ContainerStarted","Data":"48897407b149ab53aaae487249eaf18cd2504ee92c14bcf5a47700f942a5e215"}
Feb 23 00:22:33 crc kubenswrapper[4735]: I0223 00:22:33.136886 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"1ad20943-871d-4331-a0e1-5145da93efee","Type":"ContainerStarted","Data":"4ecd246665e903d35db1f0c4928fb517a181be2a52e329351c73bb60f883fa5d"}
Feb 23 00:22:33 crc kubenswrapper[4735]: I0223 00:22:33.138391 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7" event={"ID":"6e748a5f-2b4d-4815-9b8f-0429055eae26","Type":"ContainerStarted","Data":"747e74cab237c483ba1228619cc67ca7bab9230917ec213cf047f92341f19074"}
Feb 23 00:22:33 crc kubenswrapper[4735]: I0223 00:22:33.482289 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-7595f5577f-ddp8t"]
Feb 23 00:22:33 crc kubenswrapper[4735]: I0223 00:22:33.483447 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7595f5577f-ddp8t"
Feb 23 00:22:33 crc kubenswrapper[4735]: I0223 00:22:33.485498 4735 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-cert"
Feb 23 00:22:33 crc kubenswrapper[4735]: I0223 00:22:33.486702 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-coll-event-sg-core-configmap"
Feb 23 00:22:33 crc kubenswrapper[4735]: I0223 00:22:33.503180 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-7595f5577f-ddp8t"]
Feb 23 00:22:33 crc kubenswrapper[4735]: I0223 00:22:33.669539 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/1b92acb6-6ef0-47d8-a7a4-ae61aaeb513d-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-7595f5577f-ddp8t\" (UID: \"1b92acb6-6ef0-47d8-a7a4-ae61aaeb513d\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7595f5577f-ddp8t"
Feb 23 00:22:33 crc kubenswrapper[4735]: I0223 00:22:33.669635 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/1b92acb6-6ef0-47d8-a7a4-ae61aaeb513d-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-7595f5577f-ddp8t\" (UID: \"1b92acb6-6ef0-47d8-a7a4-ae61aaeb513d\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7595f5577f-ddp8t"
Feb 23 00:22:33 crc kubenswrapper[4735]: I0223 00:22:33.669667 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/1b92acb6-6ef0-47d8-a7a4-ae61aaeb513d-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-7595f5577f-ddp8t\" (UID: \"1b92acb6-6ef0-47d8-a7a4-ae61aaeb513d\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7595f5577f-ddp8t"
Feb 23 00:22:33 crc kubenswrapper[4735]: I0223 00:22:33.669889 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mjxs\" (UniqueName: \"kubernetes.io/projected/1b92acb6-6ef0-47d8-a7a4-ae61aaeb513d-kube-api-access-4mjxs\") pod \"default-cloud1-coll-event-smartgateway-7595f5577f-ddp8t\" (UID: \"1b92acb6-6ef0-47d8-a7a4-ae61aaeb513d\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7595f5577f-ddp8t"
Feb 23 00:22:33 crc kubenswrapper[4735]: I0223 00:22:33.771413 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mjxs\" (UniqueName: \"kubernetes.io/projected/1b92acb6-6ef0-47d8-a7a4-ae61aaeb513d-kube-api-access-4mjxs\") pod \"default-cloud1-coll-event-smartgateway-7595f5577f-ddp8t\" (UID: \"1b92acb6-6ef0-47d8-a7a4-ae61aaeb513d\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7595f5577f-ddp8t"
Feb 23 00:22:33 crc kubenswrapper[4735]: I0223 00:22:33.771465 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/1b92acb6-6ef0-47d8-a7a4-ae61aaeb513d-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-7595f5577f-ddp8t\" (UID: \"1b92acb6-6ef0-47d8-a7a4-ae61aaeb513d\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7595f5577f-ddp8t"
Feb 23 00:22:33 crc kubenswrapper[4735]: I0223 00:22:33.771509 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/1b92acb6-6ef0-47d8-a7a4-ae61aaeb513d-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-7595f5577f-ddp8t\" (UID: \"1b92acb6-6ef0-47d8-a7a4-ae61aaeb513d\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7595f5577f-ddp8t"
Feb 23 00:22:33 crc kubenswrapper[4735]: I0223 00:22:33.771529 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/1b92acb6-6ef0-47d8-a7a4-ae61aaeb513d-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-7595f5577f-ddp8t\" (UID: \"1b92acb6-6ef0-47d8-a7a4-ae61aaeb513d\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7595f5577f-ddp8t"
Feb 23 00:22:33 crc kubenswrapper[4735]: I0223 00:22:33.772013 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/1b92acb6-6ef0-47d8-a7a4-ae61aaeb513d-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-7595f5577f-ddp8t\" (UID: \"1b92acb6-6ef0-47d8-a7a4-ae61aaeb513d\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7595f5577f-ddp8t"
Feb 23 00:22:33 crc kubenswrapper[4735]: I0223 00:22:33.772572 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/1b92acb6-6ef0-47d8-a7a4-ae61aaeb513d-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-7595f5577f-ddp8t\" (UID: \"1b92acb6-6ef0-47d8-a7a4-ae61aaeb513d\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7595f5577f-ddp8t"
Feb 23 00:22:33 crc kubenswrapper[4735]: I0223 00:22:33.787706 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mjxs\" (UniqueName: \"kubernetes.io/projected/1b92acb6-6ef0-47d8-a7a4-ae61aaeb513d-kube-api-access-4mjxs\") pod \"default-cloud1-coll-event-smartgateway-7595f5577f-ddp8t\" (UID: \"1b92acb6-6ef0-47d8-a7a4-ae61aaeb513d\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7595f5577f-ddp8t"
Feb 23 00:22:33 crc kubenswrapper[4735]: I0223 00:22:33.791560 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/1b92acb6-6ef0-47d8-a7a4-ae61aaeb513d-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-7595f5577f-ddp8t\" (UID: \"1b92acb6-6ef0-47d8-a7a4-ae61aaeb513d\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7595f5577f-ddp8t"
Feb 23 00:22:33 crc kubenswrapper[4735]: I0223 00:22:33.796892 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7595f5577f-ddp8t"
Feb 23 00:22:34 crc kubenswrapper[4735]: I0223 00:22:34.150684 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"1ad20943-871d-4331-a0e1-5145da93efee","Type":"ContainerStarted","Data":"bc3151051583f9be7a5c0bca454a574110cd890b4e2abb51a05c2b3736220593"}
Feb 23 00:22:34 crc kubenswrapper[4735]: I0223 00:22:34.215364 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-7595f5577f-ddp8t"]
Feb 23 00:22:34 crc kubenswrapper[4735]: I0223 00:22:34.559930 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-5945b6d778-p4wnt"]
Feb 23 00:22:34 crc kubenswrapper[4735]: I0223 00:22:34.561408 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5945b6d778-p4wnt" Feb 23 00:22:34 crc kubenswrapper[4735]: I0223 00:22:34.565901 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-ceil-event-sg-core-configmap" Feb 23 00:22:34 crc kubenswrapper[4735]: I0223 00:22:34.567525 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-5945b6d778-p4wnt"] Feb 23 00:22:34 crc kubenswrapper[4735]: I0223 00:22:34.686097 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/582cde1e-9aab-4e5f-bad8-f9c2d01a5742-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-5945b6d778-p4wnt\" (UID: \"582cde1e-9aab-4e5f-bad8-f9c2d01a5742\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5945b6d778-p4wnt" Feb 23 00:22:34 crc kubenswrapper[4735]: I0223 00:22:34.686164 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/582cde1e-9aab-4e5f-bad8-f9c2d01a5742-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-5945b6d778-p4wnt\" (UID: \"582cde1e-9aab-4e5f-bad8-f9c2d01a5742\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5945b6d778-p4wnt" Feb 23 00:22:34 crc kubenswrapper[4735]: I0223 00:22:34.686194 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2l7p\" (UniqueName: \"kubernetes.io/projected/582cde1e-9aab-4e5f-bad8-f9c2d01a5742-kube-api-access-f2l7p\") pod \"default-cloud1-ceil-event-smartgateway-5945b6d778-p4wnt\" (UID: \"582cde1e-9aab-4e5f-bad8-f9c2d01a5742\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5945b6d778-p4wnt" Feb 23 00:22:34 crc kubenswrapper[4735]: I0223 00:22:34.686245 4735 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/582cde1e-9aab-4e5f-bad8-f9c2d01a5742-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-5945b6d778-p4wnt\" (UID: \"582cde1e-9aab-4e5f-bad8-f9c2d01a5742\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5945b6d778-p4wnt" Feb 23 00:22:34 crc kubenswrapper[4735]: I0223 00:22:34.787686 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/582cde1e-9aab-4e5f-bad8-f9c2d01a5742-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-5945b6d778-p4wnt\" (UID: \"582cde1e-9aab-4e5f-bad8-f9c2d01a5742\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5945b6d778-p4wnt" Feb 23 00:22:34 crc kubenswrapper[4735]: I0223 00:22:34.787794 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/582cde1e-9aab-4e5f-bad8-f9c2d01a5742-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-5945b6d778-p4wnt\" (UID: \"582cde1e-9aab-4e5f-bad8-f9c2d01a5742\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5945b6d778-p4wnt" Feb 23 00:22:34 crc kubenswrapper[4735]: I0223 00:22:34.787830 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/582cde1e-9aab-4e5f-bad8-f9c2d01a5742-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-5945b6d778-p4wnt\" (UID: \"582cde1e-9aab-4e5f-bad8-f9c2d01a5742\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5945b6d778-p4wnt" Feb 23 00:22:34 crc kubenswrapper[4735]: I0223 00:22:34.787861 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2l7p\" (UniqueName: 
\"kubernetes.io/projected/582cde1e-9aab-4e5f-bad8-f9c2d01a5742-kube-api-access-f2l7p\") pod \"default-cloud1-ceil-event-smartgateway-5945b6d778-p4wnt\" (UID: \"582cde1e-9aab-4e5f-bad8-f9c2d01a5742\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5945b6d778-p4wnt" Feb 23 00:22:34 crc kubenswrapper[4735]: I0223 00:22:34.788307 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/582cde1e-9aab-4e5f-bad8-f9c2d01a5742-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-5945b6d778-p4wnt\" (UID: \"582cde1e-9aab-4e5f-bad8-f9c2d01a5742\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5945b6d778-p4wnt" Feb 23 00:22:34 crc kubenswrapper[4735]: I0223 00:22:34.788791 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/582cde1e-9aab-4e5f-bad8-f9c2d01a5742-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-5945b6d778-p4wnt\" (UID: \"582cde1e-9aab-4e5f-bad8-f9c2d01a5742\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5945b6d778-p4wnt" Feb 23 00:22:34 crc kubenswrapper[4735]: I0223 00:22:34.794335 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/582cde1e-9aab-4e5f-bad8-f9c2d01a5742-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-5945b6d778-p4wnt\" (UID: \"582cde1e-9aab-4e5f-bad8-f9c2d01a5742\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5945b6d778-p4wnt" Feb 23 00:22:34 crc kubenswrapper[4735]: I0223 00:22:34.804086 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2l7p\" (UniqueName: \"kubernetes.io/projected/582cde1e-9aab-4e5f-bad8-f9c2d01a5742-kube-api-access-f2l7p\") pod \"default-cloud1-ceil-event-smartgateway-5945b6d778-p4wnt\" (UID: \"582cde1e-9aab-4e5f-bad8-f9c2d01a5742\") " 
pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5945b6d778-p4wnt" Feb 23 00:22:34 crc kubenswrapper[4735]: I0223 00:22:34.893436 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5945b6d778-p4wnt" Feb 23 00:22:35 crc kubenswrapper[4735]: I0223 00:22:35.157740 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7595f5577f-ddp8t" event={"ID":"1b92acb6-6ef0-47d8-a7a4-ae61aaeb513d","Type":"ContainerStarted","Data":"f3011ff477e327a71e1d24e9412059c289d3ae7d9deac3f47edae4199de7b108"} Feb 23 00:22:35 crc kubenswrapper[4735]: I0223 00:22:35.160900 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"1ad20943-871d-4331-a0e1-5145da93efee","Type":"ContainerStarted","Data":"e750397b884a0e97060427710c992f7bc25658ee50e86335a2bb2b964910ccf4"} Feb 23 00:22:35 crc kubenswrapper[4735]: I0223 00:22:35.183327 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/alertmanager-default-0" podStartSLOduration=20.777218674 podStartE2EDuration="29.183313274s" podCreationTimestamp="2026-02-23 00:22:06 +0000 UTC" firstStartedPulling="2026-02-23 00:22:26.073922613 +0000 UTC m=+904.537468584" lastFinishedPulling="2026-02-23 00:22:34.480017203 +0000 UTC m=+912.943563184" observedRunningTime="2026-02-23 00:22:35.17734096 +0000 UTC m=+913.640886931" watchObservedRunningTime="2026-02-23 00:22:35.183313274 +0000 UTC m=+913.646859245" Feb 23 00:22:35 crc kubenswrapper[4735]: I0223 00:22:35.989685 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/prometheus-default-0" Feb 23 00:22:38 crc kubenswrapper[4735]: I0223 00:22:38.860935 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-5945b6d778-p4wnt"] Feb 23 00:22:39 crc 
kubenswrapper[4735]: I0223 00:22:39.190833 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht" event={"ID":"475fb5a3-ada6-41f2-baff-da287ca512f3","Type":"ContainerStarted","Data":"1cf2812e1ed945058fbee818d55aa04bb6c814bfadb59389622b691b53a26140"} Feb 23 00:22:39 crc kubenswrapper[4735]: I0223 00:22:39.193050 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7595f5577f-ddp8t" event={"ID":"1b92acb6-6ef0-47d8-a7a4-ae61aaeb513d","Type":"ContainerStarted","Data":"ecadce1f935bc60c837b1ca8da7395b50ca813320c3deb488b089bb817c85047"} Feb 23 00:22:39 crc kubenswrapper[4735]: I0223 00:22:39.195303 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq" event={"ID":"76aa4cee-ada5-4bc9-854f-127390e15fd2","Type":"ContainerStarted","Data":"aa37cb41d40124063111fd5d56d06dff62566b59d31a140c961f0c221313aa1a"} Feb 23 00:22:39 crc kubenswrapper[4735]: I0223 00:22:39.197346 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7" event={"ID":"6e748a5f-2b4d-4815-9b8f-0429055eae26","Type":"ContainerStarted","Data":"4973c03aaf4b8b77ed4829067081ea80cc2c6e37cab231be6d7728e060300336"} Feb 23 00:22:39 crc kubenswrapper[4735]: I0223 00:22:39.198793 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5945b6d778-p4wnt" event={"ID":"582cde1e-9aab-4e5f-bad8-f9c2d01a5742","Type":"ContainerStarted","Data":"afaca3f7cc3a2922dc34fa02d85c095be399c91eea63493c7049cd57efaa68e7"} Feb 23 00:22:40 crc kubenswrapper[4735]: I0223 00:22:40.207603 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5945b6d778-p4wnt" 
event={"ID":"582cde1e-9aab-4e5f-bad8-f9c2d01a5742","Type":"ContainerStarted","Data":"dfe844780f201e34e541f5f25d42f6945ee3dc33a7ca2c4ff6f2ef7869c2fa34"} Feb 23 00:22:40 crc kubenswrapper[4735]: I0223 00:22:40.990158 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/prometheus-default-0" Feb 23 00:22:41 crc kubenswrapper[4735]: I0223 00:22:41.030527 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/prometheus-default-0" Feb 23 00:22:41 crc kubenswrapper[4735]: I0223 00:22:41.252990 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/prometheus-default-0" Feb 23 00:22:44 crc kubenswrapper[4735]: I0223 00:22:44.239909 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7595f5577f-ddp8t" event={"ID":"1b92acb6-6ef0-47d8-a7a4-ae61aaeb513d","Type":"ContainerStarted","Data":"6771b2bd5637832cb903e06f0a6aacfcfeb1a806919119b7e2ed3a27df015542"} Feb 23 00:22:44 crc kubenswrapper[4735]: I0223 00:22:44.242326 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq" event={"ID":"76aa4cee-ada5-4bc9-854f-127390e15fd2","Type":"ContainerStarted","Data":"48f5a2f37c416063bd6ef204cb02d58fa3c22df828601f0719be1c09f5fc3aea"} Feb 23 00:22:44 crc kubenswrapper[4735]: I0223 00:22:44.244239 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7" event={"ID":"6e748a5f-2b4d-4815-9b8f-0429055eae26","Type":"ContainerStarted","Data":"c80c95dd3364bb314a42837259cb574c8d566ae97ec02a906bed54fd7cd2a286"} Feb 23 00:22:44 crc kubenswrapper[4735]: I0223 00:22:44.245515 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5945b6d778-p4wnt" 
event={"ID":"582cde1e-9aab-4e5f-bad8-f9c2d01a5742","Type":"ContainerStarted","Data":"ee8944541596e3849aa2a0c4b19b7f0613b3a8d352b2d6b7e4e72911884724ec"} Feb 23 00:22:44 crc kubenswrapper[4735]: I0223 00:22:44.247111 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht" event={"ID":"475fb5a3-ada6-41f2-baff-da287ca512f3","Type":"ContainerStarted","Data":"b839f15a4d26043531b83fd0da33069210c9e94b1f3d35973889c0bd2adf5a89"} Feb 23 00:22:44 crc kubenswrapper[4735]: I0223 00:22:44.261239 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7595f5577f-ddp8t" podStartSLOduration=1.692844137 podStartE2EDuration="11.261220186s" podCreationTimestamp="2026-02-23 00:22:33 +0000 UTC" firstStartedPulling="2026-02-23 00:22:34.233080682 +0000 UTC m=+912.696626653" lastFinishedPulling="2026-02-23 00:22:43.801456731 +0000 UTC m=+922.265002702" observedRunningTime="2026-02-23 00:22:44.258412418 +0000 UTC m=+922.721958389" watchObservedRunningTime="2026-02-23 00:22:44.261220186 +0000 UTC m=+922.724766157" Feb 23 00:22:44 crc kubenswrapper[4735]: I0223 00:22:44.281639 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7" podStartSLOduration=6.509976409 podStartE2EDuration="18.281621959s" podCreationTimestamp="2026-02-23 00:22:26 +0000 UTC" firstStartedPulling="2026-02-23 00:22:32.079984594 +0000 UTC m=+910.543530565" lastFinishedPulling="2026-02-23 00:22:43.851630144 +0000 UTC m=+922.315176115" observedRunningTime="2026-02-23 00:22:44.281513826 +0000 UTC m=+922.745059797" watchObservedRunningTime="2026-02-23 00:22:44.281621959 +0000 UTC m=+922.745167930" Feb 23 00:22:44 crc kubenswrapper[4735]: I0223 00:22:44.314372 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht" podStartSLOduration=3.503191937 podStartE2EDuration="22.314352981s" podCreationTimestamp="2026-02-23 00:22:22 +0000 UTC" firstStartedPulling="2026-02-23 00:22:25.098102013 +0000 UTC m=+903.561647984" lastFinishedPulling="2026-02-23 00:22:43.909263057 +0000 UTC m=+922.372809028" observedRunningTime="2026-02-23 00:22:44.308396896 +0000 UTC m=+922.771942867" watchObservedRunningTime="2026-02-23 00:22:44.314352981 +0000 UTC m=+922.777898952" Feb 23 00:22:44 crc kubenswrapper[4735]: I0223 00:22:44.333309 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq" podStartSLOduration=3.515726794 podStartE2EDuration="24.333287928s" podCreationTimestamp="2026-02-23 00:22:20 +0000 UTC" firstStartedPulling="2026-02-23 00:22:23.040281561 +0000 UTC m=+901.503827532" lastFinishedPulling="2026-02-23 00:22:43.857842695 +0000 UTC m=+922.321388666" observedRunningTime="2026-02-23 00:22:44.326425582 +0000 UTC m=+922.789971563" watchObservedRunningTime="2026-02-23 00:22:44.333287928 +0000 UTC m=+922.796833899" Feb 23 00:22:44 crc kubenswrapper[4735]: I0223 00:22:44.348535 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5945b6d778-p4wnt" podStartSLOduration=5.446419211 podStartE2EDuration="10.348520746s" podCreationTimestamp="2026-02-23 00:22:34 +0000 UTC" firstStartedPulling="2026-02-23 00:22:38.867126607 +0000 UTC m=+917.330672578" lastFinishedPulling="2026-02-23 00:22:43.769228142 +0000 UTC m=+922.232774113" observedRunningTime="2026-02-23 00:22:44.34534277 +0000 UTC m=+922.808888741" watchObservedRunningTime="2026-02-23 00:22:44.348520746 +0000 UTC m=+922.812066717" Feb 23 00:22:45 crc kubenswrapper[4735]: I0223 00:22:45.068833 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["service-telemetry/default-interconnect-68864d46cb-ss9cq"] Feb 23 00:22:45 crc kubenswrapper[4735]: I0223 00:22:45.069149 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/default-interconnect-68864d46cb-ss9cq" podUID="a566807c-75ea-4b30-b9bd-dd211d0eca22" containerName="default-interconnect" containerID="cri-o://6150444994c5ebba54b86903649f02f601d756db603f84578e777f37bc678ca5" gracePeriod=30 Feb 23 00:22:45 crc kubenswrapper[4735]: I0223 00:22:45.260656 4735 generic.go:334] "Generic (PLEG): container finished" podID="a566807c-75ea-4b30-b9bd-dd211d0eca22" containerID="6150444994c5ebba54b86903649f02f601d756db603f84578e777f37bc678ca5" exitCode=0 Feb 23 00:22:45 crc kubenswrapper[4735]: I0223 00:22:45.260823 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-ss9cq" event={"ID":"a566807c-75ea-4b30-b9bd-dd211d0eca22","Type":"ContainerDied","Data":"6150444994c5ebba54b86903649f02f601d756db603f84578e777f37bc678ca5"} Feb 23 00:22:45 crc kubenswrapper[4735]: I0223 00:22:45.469145 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-ss9cq" Feb 23 00:22:45 crc kubenswrapper[4735]: I0223 00:22:45.577728 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/a566807c-75ea-4b30-b9bd-dd211d0eca22-default-interconnect-openstack-ca\") pod \"a566807c-75ea-4b30-b9bd-dd211d0eca22\" (UID: \"a566807c-75ea-4b30-b9bd-dd211d0eca22\") " Feb 23 00:22:45 crc kubenswrapper[4735]: I0223 00:22:45.577774 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/a566807c-75ea-4b30-b9bd-dd211d0eca22-default-interconnect-inter-router-credentials\") pod \"a566807c-75ea-4b30-b9bd-dd211d0eca22\" (UID: \"a566807c-75ea-4b30-b9bd-dd211d0eca22\") " Feb 23 00:22:45 crc kubenswrapper[4735]: I0223 00:22:45.577801 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54vtx\" (UniqueName: \"kubernetes.io/projected/a566807c-75ea-4b30-b9bd-dd211d0eca22-kube-api-access-54vtx\") pod \"a566807c-75ea-4b30-b9bd-dd211d0eca22\" (UID: \"a566807c-75ea-4b30-b9bd-dd211d0eca22\") " Feb 23 00:22:45 crc kubenswrapper[4735]: I0223 00:22:45.577827 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/a566807c-75ea-4b30-b9bd-dd211d0eca22-sasl-config\") pod \"a566807c-75ea-4b30-b9bd-dd211d0eca22\" (UID: \"a566807c-75ea-4b30-b9bd-dd211d0eca22\") " Feb 23 00:22:45 crc kubenswrapper[4735]: I0223 00:22:45.577931 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/a566807c-75ea-4b30-b9bd-dd211d0eca22-sasl-users\") pod \"a566807c-75ea-4b30-b9bd-dd211d0eca22\" (UID: \"a566807c-75ea-4b30-b9bd-dd211d0eca22\") " Feb 23 00:22:45 crc kubenswrapper[4735]: I0223 
00:22:45.577990 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/a566807c-75ea-4b30-b9bd-dd211d0eca22-default-interconnect-openstack-credentials\") pod \"a566807c-75ea-4b30-b9bd-dd211d0eca22\" (UID: \"a566807c-75ea-4b30-b9bd-dd211d0eca22\") " Feb 23 00:22:45 crc kubenswrapper[4735]: I0223 00:22:45.578044 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/a566807c-75ea-4b30-b9bd-dd211d0eca22-default-interconnect-inter-router-ca\") pod \"a566807c-75ea-4b30-b9bd-dd211d0eca22\" (UID: \"a566807c-75ea-4b30-b9bd-dd211d0eca22\") " Feb 23 00:22:45 crc kubenswrapper[4735]: I0223 00:22:45.580320 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a566807c-75ea-4b30-b9bd-dd211d0eca22-sasl-config" (OuterVolumeSpecName: "sasl-config") pod "a566807c-75ea-4b30-b9bd-dd211d0eca22" (UID: "a566807c-75ea-4b30-b9bd-dd211d0eca22"). InnerVolumeSpecName "sasl-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:22:45 crc kubenswrapper[4735]: I0223 00:22:45.584742 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a566807c-75ea-4b30-b9bd-dd211d0eca22-default-interconnect-openstack-credentials" (OuterVolumeSpecName: "default-interconnect-openstack-credentials") pod "a566807c-75ea-4b30-b9bd-dd211d0eca22" (UID: "a566807c-75ea-4b30-b9bd-dd211d0eca22"). InnerVolumeSpecName "default-interconnect-openstack-credentials". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:22:45 crc kubenswrapper[4735]: I0223 00:22:45.589115 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a566807c-75ea-4b30-b9bd-dd211d0eca22-default-interconnect-inter-router-ca" (OuterVolumeSpecName: "default-interconnect-inter-router-ca") pod "a566807c-75ea-4b30-b9bd-dd211d0eca22" (UID: "a566807c-75ea-4b30-b9bd-dd211d0eca22"). InnerVolumeSpecName "default-interconnect-inter-router-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:22:45 crc kubenswrapper[4735]: I0223 00:22:45.594045 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a566807c-75ea-4b30-b9bd-dd211d0eca22-kube-api-access-54vtx" (OuterVolumeSpecName: "kube-api-access-54vtx") pod "a566807c-75ea-4b30-b9bd-dd211d0eca22" (UID: "a566807c-75ea-4b30-b9bd-dd211d0eca22"). InnerVolumeSpecName "kube-api-access-54vtx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:22:45 crc kubenswrapper[4735]: I0223 00:22:45.595651 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a566807c-75ea-4b30-b9bd-dd211d0eca22-default-interconnect-inter-router-credentials" (OuterVolumeSpecName: "default-interconnect-inter-router-credentials") pod "a566807c-75ea-4b30-b9bd-dd211d0eca22" (UID: "a566807c-75ea-4b30-b9bd-dd211d0eca22"). InnerVolumeSpecName "default-interconnect-inter-router-credentials". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:22:45 crc kubenswrapper[4735]: I0223 00:22:45.599322 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a566807c-75ea-4b30-b9bd-dd211d0eca22-default-interconnect-openstack-ca" (OuterVolumeSpecName: "default-interconnect-openstack-ca") pod "a566807c-75ea-4b30-b9bd-dd211d0eca22" (UID: "a566807c-75ea-4b30-b9bd-dd211d0eca22"). InnerVolumeSpecName "default-interconnect-openstack-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:22:45 crc kubenswrapper[4735]: I0223 00:22:45.602507 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a566807c-75ea-4b30-b9bd-dd211d0eca22-sasl-users" (OuterVolumeSpecName: "sasl-users") pod "a566807c-75ea-4b30-b9bd-dd211d0eca22" (UID: "a566807c-75ea-4b30-b9bd-dd211d0eca22"). InnerVolumeSpecName "sasl-users". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:22:45 crc kubenswrapper[4735]: I0223 00:22:45.679435 4735 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/a566807c-75ea-4b30-b9bd-dd211d0eca22-default-interconnect-openstack-ca\") on node \"crc\" DevicePath \"\"" Feb 23 00:22:45 crc kubenswrapper[4735]: I0223 00:22:45.679486 4735 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/a566807c-75ea-4b30-b9bd-dd211d0eca22-default-interconnect-inter-router-credentials\") on node \"crc\" DevicePath \"\"" Feb 23 00:22:45 crc kubenswrapper[4735]: I0223 00:22:45.679506 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54vtx\" (UniqueName: \"kubernetes.io/projected/a566807c-75ea-4b30-b9bd-dd211d0eca22-kube-api-access-54vtx\") on node \"crc\" DevicePath \"\"" Feb 23 00:22:45 crc kubenswrapper[4735]: I0223 00:22:45.679520 4735 reconciler_common.go:293] "Volume detached for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/a566807c-75ea-4b30-b9bd-dd211d0eca22-sasl-config\") on node \"crc\" DevicePath \"\"" Feb 23 00:22:45 crc kubenswrapper[4735]: I0223 00:22:45.679533 4735 reconciler_common.go:293] "Volume detached for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/a566807c-75ea-4b30-b9bd-dd211d0eca22-sasl-users\") on node \"crc\" DevicePath \"\"" Feb 23 00:22:45 crc kubenswrapper[4735]: I0223 00:22:45.679545 4735 
reconciler_common.go:293] "Volume detached for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/a566807c-75ea-4b30-b9bd-dd211d0eca22-default-interconnect-openstack-credentials\") on node \"crc\" DevicePath \"\"" Feb 23 00:22:45 crc kubenswrapper[4735]: I0223 00:22:45.679558 4735 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/a566807c-75ea-4b30-b9bd-dd211d0eca22-default-interconnect-inter-router-ca\") on node \"crc\" DevicePath \"\"" Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.267323 4735 generic.go:334] "Generic (PLEG): container finished" podID="582cde1e-9aab-4e5f-bad8-f9c2d01a5742" containerID="dfe844780f201e34e541f5f25d42f6945ee3dc33a7ca2c4ff6f2ef7869c2fa34" exitCode=0 Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.267398 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5945b6d778-p4wnt" event={"ID":"582cde1e-9aab-4e5f-bad8-f9c2d01a5742","Type":"ContainerDied","Data":"dfe844780f201e34e541f5f25d42f6945ee3dc33a7ca2c4ff6f2ef7869c2fa34"} Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.268026 4735 scope.go:117] "RemoveContainer" containerID="dfe844780f201e34e541f5f25d42f6945ee3dc33a7ca2c4ff6f2ef7869c2fa34" Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.269839 4735 generic.go:334] "Generic (PLEG): container finished" podID="6e748a5f-2b4d-4815-9b8f-0429055eae26" containerID="4973c03aaf4b8b77ed4829067081ea80cc2c6e37cab231be6d7728e060300336" exitCode=0 Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.269874 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7" event={"ID":"6e748a5f-2b4d-4815-9b8f-0429055eae26","Type":"ContainerDied","Data":"4973c03aaf4b8b77ed4829067081ea80cc2c6e37cab231be6d7728e060300336"} Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 
00:22:46.270769 4735 scope.go:117] "RemoveContainer" containerID="4973c03aaf4b8b77ed4829067081ea80cc2c6e37cab231be6d7728e060300336" Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.271900 4735 generic.go:334] "Generic (PLEG): container finished" podID="475fb5a3-ada6-41f2-baff-da287ca512f3" containerID="1cf2812e1ed945058fbee818d55aa04bb6c814bfadb59389622b691b53a26140" exitCode=0 Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.273600 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-ss9cq" Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.277067 4735 generic.go:334] "Generic (PLEG): container finished" podID="1b92acb6-6ef0-47d8-a7a4-ae61aaeb513d" containerID="ecadce1f935bc60c837b1ca8da7395b50ca813320c3deb488b089bb817c85047" exitCode=0 Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.279727 4735 generic.go:334] "Generic (PLEG): container finished" podID="76aa4cee-ada5-4bc9-854f-127390e15fd2" containerID="aa37cb41d40124063111fd5d56d06dff62566b59d31a140c961f0c221313aa1a" exitCode=0 Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.288136 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht" event={"ID":"475fb5a3-ada6-41f2-baff-da287ca512f3","Type":"ContainerDied","Data":"1cf2812e1ed945058fbee818d55aa04bb6c814bfadb59389622b691b53a26140"} Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.288209 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-ss9cq" event={"ID":"a566807c-75ea-4b30-b9bd-dd211d0eca22","Type":"ContainerDied","Data":"ce9a21a3010c09df1cf440ec770e8c18cc635aa3a5bfcf76f78ef15616894285"} Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.288232 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7595f5577f-ddp8t" 
event={"ID":"1b92acb6-6ef0-47d8-a7a4-ae61aaeb513d","Type":"ContainerDied","Data":"ecadce1f935bc60c837b1ca8da7395b50ca813320c3deb488b089bb817c85047"} Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.288246 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq" event={"ID":"76aa4cee-ada5-4bc9-854f-127390e15fd2","Type":"ContainerDied","Data":"aa37cb41d40124063111fd5d56d06dff62566b59d31a140c961f0c221313aa1a"} Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.288921 4735 scope.go:117] "RemoveContainer" containerID="aa37cb41d40124063111fd5d56d06dff62566b59d31a140c961f0c221313aa1a" Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.294344 4735 scope.go:117] "RemoveContainer" containerID="6150444994c5ebba54b86903649f02f601d756db603f84578e777f37bc678ca5" Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.294919 4735 scope.go:117] "RemoveContainer" containerID="ecadce1f935bc60c837b1ca8da7395b50ca813320c3deb488b089bb817c85047" Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.295989 4735 scope.go:117] "RemoveContainer" containerID="1cf2812e1ed945058fbee818d55aa04bb6c814bfadb59389622b691b53a26140" Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.395983 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-p5qtk"] Feb 23 00:22:46 crc kubenswrapper[4735]: E0223 00:22:46.396236 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a566807c-75ea-4b30-b9bd-dd211d0eca22" containerName="default-interconnect" Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.396251 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a566807c-75ea-4b30-b9bd-dd211d0eca22" containerName="default-interconnect" Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.396371 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="a566807c-75ea-4b30-b9bd-dd211d0eca22" 
containerName="default-interconnect" Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.397369 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-p5qtk" Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.401328 4735 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-ca" Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.403181 4735 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-users" Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.403538 4735 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-ca" Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.407492 4735 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-credentials" Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.407705 4735 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-dockercfg-vtq5g" Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.407866 4735 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-credentials" Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.407974 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-interconnect-sasl-config" Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.423985 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-p5qtk"] Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.439034 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-ss9cq"] Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.452912 4735 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-ss9cq"] Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.499262 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/fed58dc1-ad7e-4c98-b914-6c72c64c1367-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-p5qtk\" (UID: \"fed58dc1-ad7e-4c98-b914-6c72c64c1367\") " pod="service-telemetry/default-interconnect-68864d46cb-p5qtk" Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.499333 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/fed58dc1-ad7e-4c98-b914-6c72c64c1367-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-p5qtk\" (UID: \"fed58dc1-ad7e-4c98-b914-6c72c64c1367\") " pod="service-telemetry/default-interconnect-68864d46cb-p5qtk" Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.499372 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/fed58dc1-ad7e-4c98-b914-6c72c64c1367-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-p5qtk\" (UID: \"fed58dc1-ad7e-4c98-b914-6c72c64c1367\") " pod="service-telemetry/default-interconnect-68864d46cb-p5qtk" Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.499420 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f8dj\" (UniqueName: \"kubernetes.io/projected/fed58dc1-ad7e-4c98-b914-6c72c64c1367-kube-api-access-7f8dj\") pod \"default-interconnect-68864d46cb-p5qtk\" (UID: \"fed58dc1-ad7e-4c98-b914-6c72c64c1367\") " 
pod="service-telemetry/default-interconnect-68864d46cb-p5qtk" Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.499454 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/fed58dc1-ad7e-4c98-b914-6c72c64c1367-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-p5qtk\" (UID: \"fed58dc1-ad7e-4c98-b914-6c72c64c1367\") " pod="service-telemetry/default-interconnect-68864d46cb-p5qtk" Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.499497 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/fed58dc1-ad7e-4c98-b914-6c72c64c1367-sasl-users\") pod \"default-interconnect-68864d46cb-p5qtk\" (UID: \"fed58dc1-ad7e-4c98-b914-6c72c64c1367\") " pod="service-telemetry/default-interconnect-68864d46cb-p5qtk" Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.499917 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/fed58dc1-ad7e-4c98-b914-6c72c64c1367-sasl-config\") pod \"default-interconnect-68864d46cb-p5qtk\" (UID: \"fed58dc1-ad7e-4c98-b914-6c72c64c1367\") " pod="service-telemetry/default-interconnect-68864d46cb-p5qtk" Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.601450 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/fed58dc1-ad7e-4c98-b914-6c72c64c1367-sasl-config\") pod \"default-interconnect-68864d46cb-p5qtk\" (UID: \"fed58dc1-ad7e-4c98-b914-6c72c64c1367\") " pod="service-telemetry/default-interconnect-68864d46cb-p5qtk" Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.601495 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-credentials\" 
(UniqueName: \"kubernetes.io/secret/fed58dc1-ad7e-4c98-b914-6c72c64c1367-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-p5qtk\" (UID: \"fed58dc1-ad7e-4c98-b914-6c72c64c1367\") " pod="service-telemetry/default-interconnect-68864d46cb-p5qtk" Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.601524 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/fed58dc1-ad7e-4c98-b914-6c72c64c1367-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-p5qtk\" (UID: \"fed58dc1-ad7e-4c98-b914-6c72c64c1367\") " pod="service-telemetry/default-interconnect-68864d46cb-p5qtk" Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.601544 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/fed58dc1-ad7e-4c98-b914-6c72c64c1367-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-p5qtk\" (UID: \"fed58dc1-ad7e-4c98-b914-6c72c64c1367\") " pod="service-telemetry/default-interconnect-68864d46cb-p5qtk" Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.601572 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f8dj\" (UniqueName: \"kubernetes.io/projected/fed58dc1-ad7e-4c98-b914-6c72c64c1367-kube-api-access-7f8dj\") pod \"default-interconnect-68864d46cb-p5qtk\" (UID: \"fed58dc1-ad7e-4c98-b914-6c72c64c1367\") " pod="service-telemetry/default-interconnect-68864d46cb-p5qtk" Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.601593 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/fed58dc1-ad7e-4c98-b914-6c72c64c1367-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-p5qtk\" (UID: 
\"fed58dc1-ad7e-4c98-b914-6c72c64c1367\") " pod="service-telemetry/default-interconnect-68864d46cb-p5qtk" Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.601619 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/fed58dc1-ad7e-4c98-b914-6c72c64c1367-sasl-users\") pod \"default-interconnect-68864d46cb-p5qtk\" (UID: \"fed58dc1-ad7e-4c98-b914-6c72c64c1367\") " pod="service-telemetry/default-interconnect-68864d46cb-p5qtk" Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.603386 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/fed58dc1-ad7e-4c98-b914-6c72c64c1367-sasl-config\") pod \"default-interconnect-68864d46cb-p5qtk\" (UID: \"fed58dc1-ad7e-4c98-b914-6c72c64c1367\") " pod="service-telemetry/default-interconnect-68864d46cb-p5qtk" Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.606614 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/fed58dc1-ad7e-4c98-b914-6c72c64c1367-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-p5qtk\" (UID: \"fed58dc1-ad7e-4c98-b914-6c72c64c1367\") " pod="service-telemetry/default-interconnect-68864d46cb-p5qtk" Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.606684 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/fed58dc1-ad7e-4c98-b914-6c72c64c1367-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-p5qtk\" (UID: \"fed58dc1-ad7e-4c98-b914-6c72c64c1367\") " pod="service-telemetry/default-interconnect-68864d46cb-p5qtk" Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.607066 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-users\" (UniqueName: 
\"kubernetes.io/secret/fed58dc1-ad7e-4c98-b914-6c72c64c1367-sasl-users\") pod \"default-interconnect-68864d46cb-p5qtk\" (UID: \"fed58dc1-ad7e-4c98-b914-6c72c64c1367\") " pod="service-telemetry/default-interconnect-68864d46cb-p5qtk" Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.610502 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/fed58dc1-ad7e-4c98-b914-6c72c64c1367-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-p5qtk\" (UID: \"fed58dc1-ad7e-4c98-b914-6c72c64c1367\") " pod="service-telemetry/default-interconnect-68864d46cb-p5qtk" Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.611579 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/fed58dc1-ad7e-4c98-b914-6c72c64c1367-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-p5qtk\" (UID: \"fed58dc1-ad7e-4c98-b914-6c72c64c1367\") " pod="service-telemetry/default-interconnect-68864d46cb-p5qtk" Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.624402 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f8dj\" (UniqueName: \"kubernetes.io/projected/fed58dc1-ad7e-4c98-b914-6c72c64c1367-kube-api-access-7f8dj\") pod \"default-interconnect-68864d46cb-p5qtk\" (UID: \"fed58dc1-ad7e-4c98-b914-6c72c64c1367\") " pod="service-telemetry/default-interconnect-68864d46cb-p5qtk" Feb 23 00:22:46 crc kubenswrapper[4735]: I0223 00:22:46.728802 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-p5qtk" Feb 23 00:22:47 crc kubenswrapper[4735]: I0223 00:22:47.084221 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-p5qtk"] Feb 23 00:22:47 crc kubenswrapper[4735]: I0223 00:22:47.297458 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7" event={"ID":"6e748a5f-2b4d-4815-9b8f-0429055eae26","Type":"ContainerStarted","Data":"4d96b54908fe5d7163c29805f81e1f903416f08d126fae83a057a52450f00189"} Feb 23 00:22:47 crc kubenswrapper[4735]: I0223 00:22:47.304673 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5945b6d778-p4wnt" event={"ID":"582cde1e-9aab-4e5f-bad8-f9c2d01a5742","Type":"ContainerStarted","Data":"ba4aa684f42621405b4a8ab25dde3b01df635a7985763c9f3b0656a358b95a1d"} Feb 23 00:22:47 crc kubenswrapper[4735]: I0223 00:22:47.306945 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht" event={"ID":"475fb5a3-ada6-41f2-baff-da287ca512f3","Type":"ContainerStarted","Data":"aa89c957f7e3dfd2e667b75ffc0e14c96484b81bcc30853af62f3ce5d8006de9"} Feb 23 00:22:47 crc kubenswrapper[4735]: I0223 00:22:47.310969 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7595f5577f-ddp8t" event={"ID":"1b92acb6-6ef0-47d8-a7a4-ae61aaeb513d","Type":"ContainerStarted","Data":"1068d6798f59cc8e435fc8d1b0403486a5a508285ec570a6a73ed0b9896fb19a"} Feb 23 00:22:47 crc kubenswrapper[4735]: I0223 00:22:47.314021 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq" 
event={"ID":"76aa4cee-ada5-4bc9-854f-127390e15fd2","Type":"ContainerStarted","Data":"635c2840601b87d82aec83b14845d25d8fa475d9f82b665e07acac77bc22d977"} Feb 23 00:22:47 crc kubenswrapper[4735]: I0223 00:22:47.315832 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-p5qtk" event={"ID":"fed58dc1-ad7e-4c98-b914-6c72c64c1367","Type":"ContainerStarted","Data":"1ac5e6926eb806fbd6443f04d3d6aba91edae33351216b4f23bf830cdf44154e"} Feb 23 00:22:47 crc kubenswrapper[4735]: I0223 00:22:47.315869 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-p5qtk" event={"ID":"fed58dc1-ad7e-4c98-b914-6c72c64c1367","Type":"ContainerStarted","Data":"ce361deb2c55ea1225d7f74c0ccdfe0b1d101460846b8830a104f5dedd435652"} Feb 23 00:22:47 crc kubenswrapper[4735]: I0223 00:22:47.413669 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-interconnect-68864d46cb-p5qtk" podStartSLOduration=2.413653014 podStartE2EDuration="2.413653014s" podCreationTimestamp="2026-02-23 00:22:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-23 00:22:47.411772918 +0000 UTC m=+925.875318879" watchObservedRunningTime="2026-02-23 00:22:47.413653014 +0000 UTC m=+925.877198985" Feb 23 00:22:48 crc kubenswrapper[4735]: I0223 00:22:48.280509 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a566807c-75ea-4b30-b9bd-dd211d0eca22" path="/var/lib/kubelet/pods/a566807c-75ea-4b30-b9bd-dd211d0eca22/volumes" Feb 23 00:22:48 crc kubenswrapper[4735]: I0223 00:22:48.323490 4735 generic.go:334] "Generic (PLEG): container finished" podID="1b92acb6-6ef0-47d8-a7a4-ae61aaeb513d" containerID="1068d6798f59cc8e435fc8d1b0403486a5a508285ec570a6a73ed0b9896fb19a" exitCode=0 Feb 23 00:22:48 crc kubenswrapper[4735]: I0223 00:22:48.323574 4735 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7595f5577f-ddp8t" event={"ID":"1b92acb6-6ef0-47d8-a7a4-ae61aaeb513d","Type":"ContainerDied","Data":"1068d6798f59cc8e435fc8d1b0403486a5a508285ec570a6a73ed0b9896fb19a"} Feb 23 00:22:48 crc kubenswrapper[4735]: I0223 00:22:48.324164 4735 scope.go:117] "RemoveContainer" containerID="ecadce1f935bc60c837b1ca8da7395b50ca813320c3deb488b089bb817c85047" Feb 23 00:22:48 crc kubenswrapper[4735]: I0223 00:22:48.324648 4735 scope.go:117] "RemoveContainer" containerID="1068d6798f59cc8e435fc8d1b0403486a5a508285ec570a6a73ed0b9896fb19a" Feb 23 00:22:48 crc kubenswrapper[4735]: E0223 00:22:48.324874 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-coll-event-smartgateway-7595f5577f-ddp8t_service-telemetry(1b92acb6-6ef0-47d8-a7a4-ae61aaeb513d)\"" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7595f5577f-ddp8t" podUID="1b92acb6-6ef0-47d8-a7a4-ae61aaeb513d" Feb 23 00:22:48 crc kubenswrapper[4735]: I0223 00:22:48.331744 4735 generic.go:334] "Generic (PLEG): container finished" podID="76aa4cee-ada5-4bc9-854f-127390e15fd2" containerID="635c2840601b87d82aec83b14845d25d8fa475d9f82b665e07acac77bc22d977" exitCode=0 Feb 23 00:22:48 crc kubenswrapper[4735]: I0223 00:22:48.331815 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq" event={"ID":"76aa4cee-ada5-4bc9-854f-127390e15fd2","Type":"ContainerDied","Data":"635c2840601b87d82aec83b14845d25d8fa475d9f82b665e07acac77bc22d977"} Feb 23 00:22:48 crc kubenswrapper[4735]: I0223 00:22:48.332227 4735 scope.go:117] "RemoveContainer" containerID="635c2840601b87d82aec83b14845d25d8fa475d9f82b665e07acac77bc22d977" Feb 23 00:22:48 crc kubenswrapper[4735]: E0223 00:22:48.332391 4735 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq_service-telemetry(76aa4cee-ada5-4bc9-854f-127390e15fd2)\"" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq" podUID="76aa4cee-ada5-4bc9-854f-127390e15fd2" Feb 23 00:22:48 crc kubenswrapper[4735]: I0223 00:22:48.334699 4735 generic.go:334] "Generic (PLEG): container finished" podID="6e748a5f-2b4d-4815-9b8f-0429055eae26" containerID="4d96b54908fe5d7163c29805f81e1f903416f08d126fae83a057a52450f00189" exitCode=0 Feb 23 00:22:48 crc kubenswrapper[4735]: I0223 00:22:48.334757 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7" event={"ID":"6e748a5f-2b4d-4815-9b8f-0429055eae26","Type":"ContainerDied","Data":"4d96b54908fe5d7163c29805f81e1f903416f08d126fae83a057a52450f00189"} Feb 23 00:22:48 crc kubenswrapper[4735]: I0223 00:22:48.335035 4735 scope.go:117] "RemoveContainer" containerID="4d96b54908fe5d7163c29805f81e1f903416f08d126fae83a057a52450f00189" Feb 23 00:22:48 crc kubenswrapper[4735]: E0223 00:22:48.335170 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7_service-telemetry(6e748a5f-2b4d-4815-9b8f-0429055eae26)\"" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7" podUID="6e748a5f-2b4d-4815-9b8f-0429055eae26" Feb 23 00:22:48 crc kubenswrapper[4735]: I0223 00:22:48.337085 4735 generic.go:334] "Generic (PLEG): container finished" podID="582cde1e-9aab-4e5f-bad8-f9c2d01a5742" containerID="ba4aa684f42621405b4a8ab25dde3b01df635a7985763c9f3b0656a358b95a1d" exitCode=0 Feb 23 00:22:48 crc kubenswrapper[4735]: I0223 00:22:48.337159 4735 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5945b6d778-p4wnt" event={"ID":"582cde1e-9aab-4e5f-bad8-f9c2d01a5742","Type":"ContainerDied","Data":"ba4aa684f42621405b4a8ab25dde3b01df635a7985763c9f3b0656a358b95a1d"} Feb 23 00:22:48 crc kubenswrapper[4735]: I0223 00:22:48.337725 4735 scope.go:117] "RemoveContainer" containerID="ba4aa684f42621405b4a8ab25dde3b01df635a7985763c9f3b0656a358b95a1d" Feb 23 00:22:48 crc kubenswrapper[4735]: E0223 00:22:48.337993 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-ceil-event-smartgateway-5945b6d778-p4wnt_service-telemetry(582cde1e-9aab-4e5f-bad8-f9c2d01a5742)\"" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5945b6d778-p4wnt" podUID="582cde1e-9aab-4e5f-bad8-f9c2d01a5742" Feb 23 00:22:48 crc kubenswrapper[4735]: I0223 00:22:48.348026 4735 generic.go:334] "Generic (PLEG): container finished" podID="475fb5a3-ada6-41f2-baff-da287ca512f3" containerID="aa89c957f7e3dfd2e667b75ffc0e14c96484b81bcc30853af62f3ce5d8006de9" exitCode=0 Feb 23 00:22:48 crc kubenswrapper[4735]: I0223 00:22:48.348588 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht" event={"ID":"475fb5a3-ada6-41f2-baff-da287ca512f3","Type":"ContainerDied","Data":"aa89c957f7e3dfd2e667b75ffc0e14c96484b81bcc30853af62f3ce5d8006de9"} Feb 23 00:22:48 crc kubenswrapper[4735]: I0223 00:22:48.348843 4735 scope.go:117] "RemoveContainer" containerID="aa89c957f7e3dfd2e667b75ffc0e14c96484b81bcc30853af62f3ce5d8006de9" Feb 23 00:22:48 crc kubenswrapper[4735]: E0223 00:22:48.349021 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge 
pod=default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht_service-telemetry(475fb5a3-ada6-41f2-baff-da287ca512f3)\"" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht" podUID="475fb5a3-ada6-41f2-baff-da287ca512f3" Feb 23 00:22:48 crc kubenswrapper[4735]: I0223 00:22:48.366669 4735 scope.go:117] "RemoveContainer" containerID="aa37cb41d40124063111fd5d56d06dff62566b59d31a140c961f0c221313aa1a" Feb 23 00:22:48 crc kubenswrapper[4735]: I0223 00:22:48.400456 4735 scope.go:117] "RemoveContainer" containerID="4973c03aaf4b8b77ed4829067081ea80cc2c6e37cab231be6d7728e060300336" Feb 23 00:22:48 crc kubenswrapper[4735]: I0223 00:22:48.449076 4735 scope.go:117] "RemoveContainer" containerID="dfe844780f201e34e541f5f25d42f6945ee3dc33a7ca2c4ff6f2ef7869c2fa34" Feb 23 00:22:48 crc kubenswrapper[4735]: I0223 00:22:48.487998 4735 scope.go:117] "RemoveContainer" containerID="1cf2812e1ed945058fbee818d55aa04bb6c814bfadb59389622b691b53a26140" Feb 23 00:22:53 crc kubenswrapper[4735]: I0223 00:22:53.753133 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/qdr-test"] Feb 23 00:22:53 crc kubenswrapper[4735]: I0223 00:22:53.754640 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/qdr-test" Feb 23 00:22:53 crc kubenswrapper[4735]: I0223 00:22:53.757799 4735 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-selfsigned" Feb 23 00:22:53 crc kubenswrapper[4735]: I0223 00:22:53.760645 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"qdr-test-config" Feb 23 00:22:53 crc kubenswrapper[4735]: I0223 00:22:53.764459 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/qdr-test"] Feb 23 00:22:53 crc kubenswrapper[4735]: I0223 00:22:53.825081 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/260aad7a-143f-47d5-ab79-5cfd8eb26eb9-qdr-test-config\") pod \"qdr-test\" (UID: \"260aad7a-143f-47d5-ab79-5cfd8eb26eb9\") " pod="service-telemetry/qdr-test" Feb 23 00:22:53 crc kubenswrapper[4735]: I0223 00:22:53.825175 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwdk9\" (UniqueName: \"kubernetes.io/projected/260aad7a-143f-47d5-ab79-5cfd8eb26eb9-kube-api-access-vwdk9\") pod \"qdr-test\" (UID: \"260aad7a-143f-47d5-ab79-5cfd8eb26eb9\") " pod="service-telemetry/qdr-test" Feb 23 00:22:53 crc kubenswrapper[4735]: I0223 00:22:53.825256 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/260aad7a-143f-47d5-ab79-5cfd8eb26eb9-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"260aad7a-143f-47d5-ab79-5cfd8eb26eb9\") " pod="service-telemetry/qdr-test" Feb 23 00:22:53 crc kubenswrapper[4735]: I0223 00:22:53.926656 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-selfsigned-cert\" (UniqueName: 
\"kubernetes.io/secret/260aad7a-143f-47d5-ab79-5cfd8eb26eb9-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"260aad7a-143f-47d5-ab79-5cfd8eb26eb9\") " pod="service-telemetry/qdr-test" Feb 23 00:22:53 crc kubenswrapper[4735]: I0223 00:22:53.926795 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/260aad7a-143f-47d5-ab79-5cfd8eb26eb9-qdr-test-config\") pod \"qdr-test\" (UID: \"260aad7a-143f-47d5-ab79-5cfd8eb26eb9\") " pod="service-telemetry/qdr-test" Feb 23 00:22:53 crc kubenswrapper[4735]: I0223 00:22:53.926840 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwdk9\" (UniqueName: \"kubernetes.io/projected/260aad7a-143f-47d5-ab79-5cfd8eb26eb9-kube-api-access-vwdk9\") pod \"qdr-test\" (UID: \"260aad7a-143f-47d5-ab79-5cfd8eb26eb9\") " pod="service-telemetry/qdr-test" Feb 23 00:22:53 crc kubenswrapper[4735]: I0223 00:22:53.927755 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/260aad7a-143f-47d5-ab79-5cfd8eb26eb9-qdr-test-config\") pod \"qdr-test\" (UID: \"260aad7a-143f-47d5-ab79-5cfd8eb26eb9\") " pod="service-telemetry/qdr-test" Feb 23 00:22:53 crc kubenswrapper[4735]: I0223 00:22:53.933432 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/260aad7a-143f-47d5-ab79-5cfd8eb26eb9-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"260aad7a-143f-47d5-ab79-5cfd8eb26eb9\") " pod="service-telemetry/qdr-test" Feb 23 00:22:53 crc kubenswrapper[4735]: I0223 00:22:53.950979 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwdk9\" (UniqueName: \"kubernetes.io/projected/260aad7a-143f-47d5-ab79-5cfd8eb26eb9-kube-api-access-vwdk9\") pod \"qdr-test\" (UID: \"260aad7a-143f-47d5-ab79-5cfd8eb26eb9\") 
" pod="service-telemetry/qdr-test" Feb 23 00:22:54 crc kubenswrapper[4735]: I0223 00:22:54.070177 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/qdr-test" Feb 23 00:22:54 crc kubenswrapper[4735]: W0223 00:22:54.491742 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod260aad7a_143f_47d5_ab79_5cfd8eb26eb9.slice/crio-3a3b97f43da8df499f1c7d58fa602ee20fba9ceaf5770e751f23a69612d03554 WatchSource:0}: Error finding container 3a3b97f43da8df499f1c7d58fa602ee20fba9ceaf5770e751f23a69612d03554: Status 404 returned error can't find the container with id 3a3b97f43da8df499f1c7d58fa602ee20fba9ceaf5770e751f23a69612d03554 Feb 23 00:22:54 crc kubenswrapper[4735]: I0223 00:22:54.492070 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/qdr-test"] Feb 23 00:22:55 crc kubenswrapper[4735]: I0223 00:22:55.404601 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/qdr-test" event={"ID":"260aad7a-143f-47d5-ab79-5cfd8eb26eb9","Type":"ContainerStarted","Data":"3a3b97f43da8df499f1c7d58fa602ee20fba9ceaf5770e751f23a69612d03554"} Feb 23 00:22:59 crc kubenswrapper[4735]: I0223 00:22:59.272443 4735 scope.go:117] "RemoveContainer" containerID="aa89c957f7e3dfd2e667b75ffc0e14c96484b81bcc30853af62f3ce5d8006de9" Feb 23 00:23:00 crc kubenswrapper[4735]: I0223 00:23:00.273038 4735 scope.go:117] "RemoveContainer" containerID="ba4aa684f42621405b4a8ab25dde3b01df635a7985763c9f3b0656a358b95a1d" Feb 23 00:23:00 crc kubenswrapper[4735]: I0223 00:23:00.273512 4735 scope.go:117] "RemoveContainer" containerID="635c2840601b87d82aec83b14845d25d8fa475d9f82b665e07acac77bc22d977" Feb 23 00:23:00 crc kubenswrapper[4735]: I0223 00:23:00.275675 4735 scope.go:117] "RemoveContainer" containerID="1068d6798f59cc8e435fc8d1b0403486a5a508285ec570a6a73ed0b9896fb19a" Feb 23 00:23:01 crc kubenswrapper[4735]: I0223 00:23:01.272038 4735 
scope.go:117] "RemoveContainer" containerID="4d96b54908fe5d7163c29805f81e1f903416f08d126fae83a057a52450f00189" Feb 23 00:23:02 crc kubenswrapper[4735]: I0223 00:23:02.709212 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 00:23:05 crc kubenswrapper[4735]: I0223 00:23:05.500213 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht" event={"ID":"475fb5a3-ada6-41f2-baff-da287ca512f3","Type":"ContainerStarted","Data":"43d6980ebf2dbaee4e55632ae3a5d7cceac98a05582dbe02b45d168eed5bf9e3"} Feb 23 00:23:05 crc kubenswrapper[4735]: I0223 00:23:05.505680 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/qdr-test" event={"ID":"260aad7a-143f-47d5-ab79-5cfd8eb26eb9","Type":"ContainerStarted","Data":"580e3f80f7b8a310443cebf63d40a482279c3519701a25112ef703bdeeba016b"} Feb 23 00:23:05 crc kubenswrapper[4735]: I0223 00:23:05.509996 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7595f5577f-ddp8t" event={"ID":"1b92acb6-6ef0-47d8-a7a4-ae61aaeb513d","Type":"ContainerStarted","Data":"92dc19930b354c91a842f3906dc09a68bbd545efe65b13ddd60933d2ba075fe2"} Feb 23 00:23:05 crc kubenswrapper[4735]: I0223 00:23:05.515149 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq" event={"ID":"76aa4cee-ada5-4bc9-854f-127390e15fd2","Type":"ContainerStarted","Data":"79faca6632579d5a07a85caacd5c844911221d1a45e0366240e2b4d24b9151d5"} Feb 23 00:23:05 crc kubenswrapper[4735]: I0223 00:23:05.521894 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7" event={"ID":"6e748a5f-2b4d-4815-9b8f-0429055eae26","Type":"ContainerStarted","Data":"147be3515dbb6d936a123f8e927876c40f9ce58900aa7b9448eece2459dfe8c4"} Feb 23 00:23:05 crc 
kubenswrapper[4735]: I0223 00:23:05.528562 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-5945b6d778-p4wnt" event={"ID":"582cde1e-9aab-4e5f-bad8-f9c2d01a5742","Type":"ContainerStarted","Data":"68dcf4507d34537c83a0fdc2c9a335730f1f587424b5c52896033f2241bf3d3f"} Feb 23 00:23:05 crc kubenswrapper[4735]: I0223 00:23:05.580943 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/qdr-test" podStartSLOduration=2.255555035 podStartE2EDuration="12.580925504s" podCreationTimestamp="2026-02-23 00:22:53 +0000 UTC" firstStartedPulling="2026-02-23 00:22:54.497008248 +0000 UTC m=+932.960554239" lastFinishedPulling="2026-02-23 00:23:04.822378737 +0000 UTC m=+943.285924708" observedRunningTime="2026-02-23 00:23:05.579271813 +0000 UTC m=+944.042817814" watchObservedRunningTime="2026-02-23 00:23:05.580925504 +0000 UTC m=+944.044471485" Feb 23 00:23:05 crc kubenswrapper[4735]: I0223 00:23:05.857243 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/stf-smoketest-smoke1-prvr4"] Feb 23 00:23:05 crc kubenswrapper[4735]: I0223 00:23:05.858628 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-prvr4" Feb 23 00:23:05 crc kubenswrapper[4735]: I0223 00:23:05.863538 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-sensubility-config" Feb 23 00:23:05 crc kubenswrapper[4735]: I0223 00:23:05.863553 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-ceilometer-entrypoint-script" Feb 23 00:23:05 crc kubenswrapper[4735]: I0223 00:23:05.863546 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-healthcheck-log" Feb 23 00:23:05 crc kubenswrapper[4735]: I0223 00:23:05.863651 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-collectd-entrypoint-script" Feb 23 00:23:05 crc kubenswrapper[4735]: I0223 00:23:05.863699 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-collectd-config" Feb 23 00:23:05 crc kubenswrapper[4735]: I0223 00:23:05.864319 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-ceilometer-publisher" Feb 23 00:23:05 crc kubenswrapper[4735]: I0223 00:23:05.872698 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-prvr4"] Feb 23 00:23:05 crc kubenswrapper[4735]: I0223 00:23:05.964861 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7847497b-f082-46df-991a-d403df7e9995-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-prvr4\" (UID: \"7847497b-f082-46df-991a-d403df7e9995\") " pod="service-telemetry/stf-smoketest-smoke1-prvr4" Feb 23 00:23:05 crc kubenswrapper[4735]: I0223 00:23:05.964932 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/7847497b-f082-46df-991a-d403df7e9995-healthcheck-log\") pod \"stf-smoketest-smoke1-prvr4\" (UID: \"7847497b-f082-46df-991a-d403df7e9995\") " pod="service-telemetry/stf-smoketest-smoke1-prvr4" Feb 23 00:23:05 crc kubenswrapper[4735]: I0223 00:23:05.964969 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7847497b-f082-46df-991a-d403df7e9995-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-prvr4\" (UID: \"7847497b-f082-46df-991a-d403df7e9995\") " pod="service-telemetry/stf-smoketest-smoke1-prvr4" Feb 23 00:23:05 crc kubenswrapper[4735]: I0223 00:23:05.964995 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/7847497b-f082-46df-991a-d403df7e9995-ceilometer-publisher\") pod \"stf-smoketest-smoke1-prvr4\" (UID: \"7847497b-f082-46df-991a-d403df7e9995\") " pod="service-telemetry/stf-smoketest-smoke1-prvr4" Feb 23 00:23:05 crc kubenswrapper[4735]: I0223 00:23:05.965016 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/7847497b-f082-46df-991a-d403df7e9995-sensubility-config\") pod \"stf-smoketest-smoke1-prvr4\" (UID: \"7847497b-f082-46df-991a-d403df7e9995\") " pod="service-telemetry/stf-smoketest-smoke1-prvr4" Feb 23 00:23:05 crc kubenswrapper[4735]: I0223 00:23:05.965042 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-642bd\" (UniqueName: \"kubernetes.io/projected/7847497b-f082-46df-991a-d403df7e9995-kube-api-access-642bd\") pod \"stf-smoketest-smoke1-prvr4\" (UID: \"7847497b-f082-46df-991a-d403df7e9995\") " pod="service-telemetry/stf-smoketest-smoke1-prvr4" Feb 23 00:23:05 crc 
kubenswrapper[4735]: I0223 00:23:05.965113 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/7847497b-f082-46df-991a-d403df7e9995-collectd-config\") pod \"stf-smoketest-smoke1-prvr4\" (UID: \"7847497b-f082-46df-991a-d403df7e9995\") " pod="service-telemetry/stf-smoketest-smoke1-prvr4" Feb 23 00:23:06 crc kubenswrapper[4735]: I0223 00:23:06.066645 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7847497b-f082-46df-991a-d403df7e9995-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-prvr4\" (UID: \"7847497b-f082-46df-991a-d403df7e9995\") " pod="service-telemetry/stf-smoketest-smoke1-prvr4" Feb 23 00:23:06 crc kubenswrapper[4735]: I0223 00:23:06.066695 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/7847497b-f082-46df-991a-d403df7e9995-ceilometer-publisher\") pod \"stf-smoketest-smoke1-prvr4\" (UID: \"7847497b-f082-46df-991a-d403df7e9995\") " pod="service-telemetry/stf-smoketest-smoke1-prvr4" Feb 23 00:23:06 crc kubenswrapper[4735]: I0223 00:23:06.066723 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/7847497b-f082-46df-991a-d403df7e9995-sensubility-config\") pod \"stf-smoketest-smoke1-prvr4\" (UID: \"7847497b-f082-46df-991a-d403df7e9995\") " pod="service-telemetry/stf-smoketest-smoke1-prvr4" Feb 23 00:23:06 crc kubenswrapper[4735]: I0223 00:23:06.066750 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-642bd\" (UniqueName: \"kubernetes.io/projected/7847497b-f082-46df-991a-d403df7e9995-kube-api-access-642bd\") pod \"stf-smoketest-smoke1-prvr4\" (UID: \"7847497b-f082-46df-991a-d403df7e9995\") " 
pod="service-telemetry/stf-smoketest-smoke1-prvr4" Feb 23 00:23:06 crc kubenswrapper[4735]: I0223 00:23:06.066775 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/7847497b-f082-46df-991a-d403df7e9995-collectd-config\") pod \"stf-smoketest-smoke1-prvr4\" (UID: \"7847497b-f082-46df-991a-d403df7e9995\") " pod="service-telemetry/stf-smoketest-smoke1-prvr4" Feb 23 00:23:06 crc kubenswrapper[4735]: I0223 00:23:06.066827 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7847497b-f082-46df-991a-d403df7e9995-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-prvr4\" (UID: \"7847497b-f082-46df-991a-d403df7e9995\") " pod="service-telemetry/stf-smoketest-smoke1-prvr4" Feb 23 00:23:06 crc kubenswrapper[4735]: I0223 00:23:06.066844 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/7847497b-f082-46df-991a-d403df7e9995-healthcheck-log\") pod \"stf-smoketest-smoke1-prvr4\" (UID: \"7847497b-f082-46df-991a-d403df7e9995\") " pod="service-telemetry/stf-smoketest-smoke1-prvr4" Feb 23 00:23:06 crc kubenswrapper[4735]: I0223 00:23:06.068090 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/7847497b-f082-46df-991a-d403df7e9995-healthcheck-log\") pod \"stf-smoketest-smoke1-prvr4\" (UID: \"7847497b-f082-46df-991a-d403df7e9995\") " pod="service-telemetry/stf-smoketest-smoke1-prvr4" Feb 23 00:23:06 crc kubenswrapper[4735]: I0223 00:23:06.068243 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7847497b-f082-46df-991a-d403df7e9995-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-prvr4\" (UID: 
\"7847497b-f082-46df-991a-d403df7e9995\") " pod="service-telemetry/stf-smoketest-smoke1-prvr4" Feb 23 00:23:06 crc kubenswrapper[4735]: I0223 00:23:06.068645 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/7847497b-f082-46df-991a-d403df7e9995-collectd-config\") pod \"stf-smoketest-smoke1-prvr4\" (UID: \"7847497b-f082-46df-991a-d403df7e9995\") " pod="service-telemetry/stf-smoketest-smoke1-prvr4" Feb 23 00:23:06 crc kubenswrapper[4735]: I0223 00:23:06.069026 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/7847497b-f082-46df-991a-d403df7e9995-sensubility-config\") pod \"stf-smoketest-smoke1-prvr4\" (UID: \"7847497b-f082-46df-991a-d403df7e9995\") " pod="service-telemetry/stf-smoketest-smoke1-prvr4" Feb 23 00:23:06 crc kubenswrapper[4735]: I0223 00:23:06.069066 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/7847497b-f082-46df-991a-d403df7e9995-ceilometer-publisher\") pod \"stf-smoketest-smoke1-prvr4\" (UID: \"7847497b-f082-46df-991a-d403df7e9995\") " pod="service-telemetry/stf-smoketest-smoke1-prvr4" Feb 23 00:23:06 crc kubenswrapper[4735]: I0223 00:23:06.069290 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7847497b-f082-46df-991a-d403df7e9995-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-prvr4\" (UID: \"7847497b-f082-46df-991a-d403df7e9995\") " pod="service-telemetry/stf-smoketest-smoke1-prvr4" Feb 23 00:23:06 crc kubenswrapper[4735]: I0223 00:23:06.101633 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-642bd\" (UniqueName: \"kubernetes.io/projected/7847497b-f082-46df-991a-d403df7e9995-kube-api-access-642bd\") pod \"stf-smoketest-smoke1-prvr4\" (UID: 
\"7847497b-f082-46df-991a-d403df7e9995\") " pod="service-telemetry/stf-smoketest-smoke1-prvr4" Feb 23 00:23:06 crc kubenswrapper[4735]: I0223 00:23:06.222773 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-prvr4" Feb 23 00:23:06 crc kubenswrapper[4735]: I0223 00:23:06.287115 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/curl"] Feb 23 00:23:06 crc kubenswrapper[4735]: I0223 00:23:06.287921 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/curl"] Feb 23 00:23:06 crc kubenswrapper[4735]: I0223 00:23:06.288043 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl" Feb 23 00:23:06 crc kubenswrapper[4735]: I0223 00:23:06.472618 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkbrm\" (UniqueName: \"kubernetes.io/projected/6592f4c0-2d9d-46ab-9dfb-c551f512ad6c-kube-api-access-zkbrm\") pod \"curl\" (UID: \"6592f4c0-2d9d-46ab-9dfb-c551f512ad6c\") " pod="service-telemetry/curl" Feb 23 00:23:06 crc kubenswrapper[4735]: I0223 00:23:06.573771 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkbrm\" (UniqueName: \"kubernetes.io/projected/6592f4c0-2d9d-46ab-9dfb-c551f512ad6c-kube-api-access-zkbrm\") pod \"curl\" (UID: \"6592f4c0-2d9d-46ab-9dfb-c551f512ad6c\") " pod="service-telemetry/curl" Feb 23 00:23:06 crc kubenswrapper[4735]: I0223 00:23:06.591123 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkbrm\" (UniqueName: \"kubernetes.io/projected/6592f4c0-2d9d-46ab-9dfb-c551f512ad6c-kube-api-access-zkbrm\") pod \"curl\" (UID: \"6592f4c0-2d9d-46ab-9dfb-c551f512ad6c\") " pod="service-telemetry/curl" Feb 23 00:23:06 crc kubenswrapper[4735]: I0223 00:23:06.624676 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/curl" Feb 23 00:23:06 crc kubenswrapper[4735]: I0223 00:23:06.690070 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-prvr4"] Feb 23 00:23:06 crc kubenswrapper[4735]: I0223 00:23:06.830500 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/curl"] Feb 23 00:23:06 crc kubenswrapper[4735]: W0223 00:23:06.833265 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6592f4c0_2d9d_46ab_9dfb_c551f512ad6c.slice/crio-6e03807da7bbcfbf7d2b9071aa0d3547fcc6a29f9b38d0bdb393bb85fa9b26ef WatchSource:0}: Error finding container 6e03807da7bbcfbf7d2b9071aa0d3547fcc6a29f9b38d0bdb393bb85fa9b26ef: Status 404 returned error can't find the container with id 6e03807da7bbcfbf7d2b9071aa0d3547fcc6a29f9b38d0bdb393bb85fa9b26ef Feb 23 00:23:07 crc kubenswrapper[4735]: I0223 00:23:07.561123 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"6592f4c0-2d9d-46ab-9dfb-c551f512ad6c","Type":"ContainerStarted","Data":"6e03807da7bbcfbf7d2b9071aa0d3547fcc6a29f9b38d0bdb393bb85fa9b26ef"} Feb 23 00:23:07 crc kubenswrapper[4735]: I0223 00:23:07.562458 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-prvr4" event={"ID":"7847497b-f082-46df-991a-d403df7e9995","Type":"ContainerStarted","Data":"5227e4c91ccf2b8f516302f3bb98c3f8a1152e3f2bfaa814d13994d1d45eba99"} Feb 23 00:23:08 crc kubenswrapper[4735]: I0223 00:23:08.572208 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"6592f4c0-2d9d-46ab-9dfb-c551f512ad6c","Type":"ContainerStarted","Data":"8862d39791033a2915e3952f7f2d2a215be9b3d720a5945df8547eb0efc2ed80"} Feb 23 00:23:08 crc kubenswrapper[4735]: I0223 00:23:08.590826 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/curl" 
podStartSLOduration=1.021755634 podStartE2EDuration="2.590806925s" podCreationTimestamp="2026-02-23 00:23:06 +0000 UTC" firstStartedPulling="2026-02-23 00:23:06.835202725 +0000 UTC m=+945.298748696" lastFinishedPulling="2026-02-23 00:23:08.404254016 +0000 UTC m=+946.867799987" observedRunningTime="2026-02-23 00:23:08.586154673 +0000 UTC m=+947.049700644" watchObservedRunningTime="2026-02-23 00:23:08.590806925 +0000 UTC m=+947.054352896" Feb 23 00:23:09 crc kubenswrapper[4735]: I0223 00:23:09.580694 4735 generic.go:334] "Generic (PLEG): container finished" podID="6592f4c0-2d9d-46ab-9dfb-c551f512ad6c" containerID="8862d39791033a2915e3952f7f2d2a215be9b3d720a5945df8547eb0efc2ed80" exitCode=0 Feb 23 00:23:09 crc kubenswrapper[4735]: I0223 00:23:09.580764 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"6592f4c0-2d9d-46ab-9dfb-c551f512ad6c","Type":"ContainerDied","Data":"8862d39791033a2915e3952f7f2d2a215be9b3d720a5945df8547eb0efc2ed80"} Feb 23 00:23:11 crc kubenswrapper[4735]: I0223 00:23:11.512800 4735 patch_prober.go:28] interesting pod/machine-config-daemon-blmnv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 00:23:11 crc kubenswrapper[4735]: I0223 00:23:11.513209 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" podUID="1cba474f-2d55-4a07-969f-25e2817a06d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 00:23:19 crc kubenswrapper[4735]: E0223 00:23:19.880615 4735 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/tripleomastercentos9/openstack-collectd:current-tripleo" Feb 23 00:23:19 crc kubenswrapper[4735]: E0223 00:23:19.881419 4735 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:smoketest-collectd,Image:quay.io/tripleomastercentos9/openstack-collectd:current-tripleo,Command:[/smoketest_collectd_entrypoint.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CLOUDNAME,Value:smoke1,ValueFrom:nil,},EnvVar{Name:ELASTICSEARCH_AUTH_PASS,Value:fZAaeWSbTOCIKNXShvmwSyjn,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_AUTH_TOKEN,Value:eyJhbGciOiJSUzI1NiIsImtpZCI6InF6SnFxNFFjbVk5VmJQZ2dNMmUxdHFmTlJlVWx4UDhSTlhIamV3RUx4WU0ifQ.eyJhdWQiOlsiaHR0cHM6Ly9rdWJlcm5ldGVzLmRlZmF1bHQuc3ZjIl0sImV4cCI6MTc3MTgwOTc2OCwiaWF0IjoxNzcxODA2MTY4LCJpc3MiOiJodHRwczovL2t1YmVybmV0ZXMuZGVmYXVsdC5zdmMiLCJqdGkiOiI5MTI0MDE0NC1lZDY3LTRlYWUtOWZkMy1jZmZiNzRlYmZkOWIiLCJrdWJlcm5ldGVzLmlvIjp7Im5hbWVzcGFjZSI6InNlcnZpY2UtdGVsZW1ldHJ5Iiwic2VydmljZWFjY291bnQiOnsibmFtZSI6InN0Zi1wcm9tZXRoZXVzLXJlYWRlciIsInVpZCI6IjQ0NjY2YTMyLTgzOWQtNGMxOC04OWI0LTliNzI0NDQ5YjkyZiJ9fSwibmJmIjoxNzcxODA2MTY4LCJzdWIiOiJzeXN0ZW06c2VydmljZWFjY291bnQ6c2VydmljZS10ZWxlbWV0cnk6c3RmLXByb21ldGhldXMtcmVhZGVyIn0.ejuJ4YEkt77Zz0EAAmtUPzw0wiUQSC5HPSPDkSzKJXQx3IbzLT41bNUIgAeF19lYBOb7RN-SNHgqOX9BZuoPcJnMiwUoJCnuh6WVcrHrj3OqXS_GAw_95uCQeS6KfL5ynHHvCRTeiKmDaMSx7mO3f0DOv_fvtC-kVj3TsEoLHnrUlvL-p2ki_Il3ZHQU9InaOtEUrZIfhWdzbU0A19slkGCX5_lcOWfYnTnYMM3Mvci1Tu2iNUNgEVQlNKCtXO3btluEyLbJABL3UH8v6KHwmYXKWoKda7ym_aeOHOss7izCzDguhnOP8Akmo4g1GiFICI5vNTGxfFzVg2kiKPIBS60kNJZJ0cTqGNQ1oaluF_68G2Y3zBITmpSLPbi0ahXEcYpX1WDeN7MvQGTUWft0_ulz0f_LsyhUrFUuXM55_vLIuKeoaZqDcgv4ORcfQC5nvc3yxDgg4ufhElgkjMVuB7fWWOM_RjsKCym_wH1vK9qI3l9tlPnaco0cklhdSVTw8UeHMN78EkRaAj0J0rf9mJXlJiNEW4kRDHVk4JgjeYV7wgzxbVsjGPK6FZ2Ea-XO9xM177nxMLsvl2KZc6MQmox9e8Ek5Sk8TfJSZ7A4JoqUbWKp343Zyy38D18Dm4Z9eHF-zNE7Q5f2xyGVxpKaO-0oz_lT2-BAKEKWSGicBZ8,ValueFrom:nil,},EnvVar{Name:OBSERVABILITY_STRATEGY,Value:<>,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:Res
ourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:collectd-config,ReadOnly:false,MountPath:/etc/minimal-collectd.conf.template,SubPath:minimal-collectd.conf.template,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:sensubility-config,ReadOnly:false,MountPath:/etc/collectd-sensubility.conf,SubPath:collectd-sensubility.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:healthcheck-log,ReadOnly:false,MountPath:/healthcheck.log,SubPath:healthcheck.log,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:collectd-entrypoint-script,ReadOnly:false,MountPath:/smoketest_collectd_entrypoint.sh,SubPath:smoketest_collectd_entrypoint.sh,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-642bd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod stf-smoketest-smoke1-prvr4_service-telemetry(7847497b-f082-46df-991a-d403df7e9995): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 23 00:23:19 crc kubenswrapper[4735]: I0223 00:23:19.899769 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/curl" Feb 23 00:23:20 crc kubenswrapper[4735]: I0223 00:23:20.075622 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_curl_6592f4c0-2d9d-46ab-9dfb-c551f512ad6c/curl/0.log" Feb 23 00:23:20 crc kubenswrapper[4735]: I0223 00:23:20.083270 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkbrm\" (UniqueName: \"kubernetes.io/projected/6592f4c0-2d9d-46ab-9dfb-c551f512ad6c-kube-api-access-zkbrm\") pod \"6592f4c0-2d9d-46ab-9dfb-c551f512ad6c\" (UID: \"6592f4c0-2d9d-46ab-9dfb-c551f512ad6c\") " Feb 23 00:23:20 crc kubenswrapper[4735]: I0223 00:23:20.092378 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6592f4c0-2d9d-46ab-9dfb-c551f512ad6c-kube-api-access-zkbrm" (OuterVolumeSpecName: "kube-api-access-zkbrm") pod "6592f4c0-2d9d-46ab-9dfb-c551f512ad6c" (UID: "6592f4c0-2d9d-46ab-9dfb-c551f512ad6c"). InnerVolumeSpecName "kube-api-access-zkbrm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:23:20 crc kubenswrapper[4735]: I0223 00:23:20.190667 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkbrm\" (UniqueName: \"kubernetes.io/projected/6592f4c0-2d9d-46ab-9dfb-c551f512ad6c-kube-api-access-zkbrm\") on node \"crc\" DevicePath \"\"" Feb 23 00:23:20 crc kubenswrapper[4735]: I0223 00:23:20.376169 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-78bcbbdcff-52mm9_213ffefc-86ab-4944-a9b5-888a99455d05/prometheus-webhook-snmp/0.log" Feb 23 00:23:20 crc kubenswrapper[4735]: I0223 00:23:20.666928 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"6592f4c0-2d9d-46ab-9dfb-c551f512ad6c","Type":"ContainerDied","Data":"6e03807da7bbcfbf7d2b9071aa0d3547fcc6a29f9b38d0bdb393bb85fa9b26ef"} Feb 23 00:23:20 crc kubenswrapper[4735]: I0223 00:23:20.666994 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e03807da7bbcfbf7d2b9071aa0d3547fcc6a29f9b38d0bdb393bb85fa9b26ef" Feb 23 00:23:20 crc kubenswrapper[4735]: I0223 00:23:20.667104 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/curl" Feb 23 00:23:26 crc kubenswrapper[4735]: E0223 00:23:26.172407 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"smoketest-collectd\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/stf-smoketest-smoke1-prvr4" podUID="7847497b-f082-46df-991a-d403df7e9995" Feb 23 00:23:26 crc kubenswrapper[4735]: I0223 00:23:26.778358 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-prvr4" event={"ID":"7847497b-f082-46df-991a-d403df7e9995","Type":"ContainerStarted","Data":"a9a8d66c7f96d75f46ab78360405d5b34d16fa61eb7bf7af68210c522c6664c1"} Feb 23 00:23:26 crc kubenswrapper[4735]: E0223 00:23:26.780142 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"smoketest-collectd\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/tripleomastercentos9/openstack-collectd:current-tripleo\\\"\"" pod="service-telemetry/stf-smoketest-smoke1-prvr4" podUID="7847497b-f082-46df-991a-d403df7e9995" Feb 23 00:23:27 crc kubenswrapper[4735]: E0223 00:23:27.787115 4735 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"smoketest-collectd\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/tripleomastercentos9/openstack-collectd:current-tripleo\\\"\"" pod="service-telemetry/stf-smoketest-smoke1-prvr4" podUID="7847497b-f082-46df-991a-d403df7e9995" Feb 23 00:23:40 crc kubenswrapper[4735]: I0223 00:23:40.912969 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-prvr4" event={"ID":"7847497b-f082-46df-991a-d403df7e9995","Type":"ContainerStarted","Data":"97c15cc18feb6ea5af6d2e037086685d11bd8286df1cd7501ab1f224b62e26c5"} Feb 23 00:23:40 crc kubenswrapper[4735]: I0223 00:23:40.955032 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="service-telemetry/stf-smoketest-smoke1-prvr4" podStartSLOduration=2.9505483630000002 podStartE2EDuration="35.955009284s" podCreationTimestamp="2026-02-23 00:23:05 +0000 UTC" firstStartedPulling="2026-02-23 00:23:06.713369819 +0000 UTC m=+945.176915790" lastFinishedPulling="2026-02-23 00:23:39.7178307 +0000 UTC m=+978.181376711" observedRunningTime="2026-02-23 00:23:40.948973018 +0000 UTC m=+979.412519029" watchObservedRunningTime="2026-02-23 00:23:40.955009284 +0000 UTC m=+979.418555285" Feb 23 00:23:41 crc kubenswrapper[4735]: I0223 00:23:41.512725 4735 patch_prober.go:28] interesting pod/machine-config-daemon-blmnv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 00:23:41 crc kubenswrapper[4735]: I0223 00:23:41.512810 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" podUID="1cba474f-2d55-4a07-969f-25e2817a06d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 00:23:50 crc kubenswrapper[4735]: I0223 00:23:50.526361 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-78bcbbdcff-52mm9_213ffefc-86ab-4944-a9b5-888a99455d05/prometheus-webhook-snmp/0.log" Feb 23 00:23:58 crc kubenswrapper[4735]: I0223 00:23:58.069927 4735 generic.go:334] "Generic (PLEG): container finished" podID="7847497b-f082-46df-991a-d403df7e9995" containerID="a9a8d66c7f96d75f46ab78360405d5b34d16fa61eb7bf7af68210c522c6664c1" exitCode=0 Feb 23 00:23:58 crc kubenswrapper[4735]: I0223 00:23:58.070053 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-prvr4" 
event={"ID":"7847497b-f082-46df-991a-d403df7e9995","Type":"ContainerDied","Data":"a9a8d66c7f96d75f46ab78360405d5b34d16fa61eb7bf7af68210c522c6664c1"} Feb 23 00:23:58 crc kubenswrapper[4735]: I0223 00:23:58.072002 4735 scope.go:117] "RemoveContainer" containerID="a9a8d66c7f96d75f46ab78360405d5b34d16fa61eb7bf7af68210c522c6664c1" Feb 23 00:24:11 crc kubenswrapper[4735]: I0223 00:24:11.512953 4735 patch_prober.go:28] interesting pod/machine-config-daemon-blmnv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 00:24:11 crc kubenswrapper[4735]: I0223 00:24:11.513592 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" podUID="1cba474f-2d55-4a07-969f-25e2817a06d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 00:24:11 crc kubenswrapper[4735]: I0223 00:24:11.513649 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" Feb 23 00:24:11 crc kubenswrapper[4735]: I0223 00:24:11.514625 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a99b454aab2d4f811a91442480520f4ceedbb717995dcf3a4814eb9e1442c818"} pod="openshift-machine-config-operator/machine-config-daemon-blmnv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 00:24:11 crc kubenswrapper[4735]: I0223 00:24:11.514695 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" podUID="1cba474f-2d55-4a07-969f-25e2817a06d0" 
containerName="machine-config-daemon" containerID="cri-o://a99b454aab2d4f811a91442480520f4ceedbb717995dcf3a4814eb9e1442c818" gracePeriod=600 Feb 23 00:24:12 crc kubenswrapper[4735]: I0223 00:24:12.228983 4735 generic.go:334] "Generic (PLEG): container finished" podID="1cba474f-2d55-4a07-969f-25e2817a06d0" containerID="a99b454aab2d4f811a91442480520f4ceedbb717995dcf3a4814eb9e1442c818" exitCode=0 Feb 23 00:24:12 crc kubenswrapper[4735]: I0223 00:24:12.229088 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" event={"ID":"1cba474f-2d55-4a07-969f-25e2817a06d0","Type":"ContainerDied","Data":"a99b454aab2d4f811a91442480520f4ceedbb717995dcf3a4814eb9e1442c818"} Feb 23 00:24:12 crc kubenswrapper[4735]: I0223 00:24:12.229563 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" event={"ID":"1cba474f-2d55-4a07-969f-25e2817a06d0","Type":"ContainerStarted","Data":"f0184932c4f2bec10d1885f359e7f020b9e339d1e4ee18bc097d894c032130ad"} Feb 23 00:24:12 crc kubenswrapper[4735]: I0223 00:24:12.229597 4735 scope.go:117] "RemoveContainer" containerID="6d18ff42046d8089570f691ef425e2b8cbc857ffa40454ed7ce709bd6b34ea17" Feb 23 00:24:14 crc kubenswrapper[4735]: I0223 00:24:14.253687 4735 generic.go:334] "Generic (PLEG): container finished" podID="7847497b-f082-46df-991a-d403df7e9995" containerID="97c15cc18feb6ea5af6d2e037086685d11bd8286df1cd7501ab1f224b62e26c5" exitCode=0 Feb 23 00:24:14 crc kubenswrapper[4735]: I0223 00:24:14.253807 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-prvr4" event={"ID":"7847497b-f082-46df-991a-d403df7e9995","Type":"ContainerDied","Data":"97c15cc18feb6ea5af6d2e037086685d11bd8286df1cd7501ab1f224b62e26c5"} Feb 23 00:24:15 crc kubenswrapper[4735]: I0223 00:24:15.616206 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-prvr4" Feb 23 00:24:15 crc kubenswrapper[4735]: I0223 00:24:15.724902 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/7847497b-f082-46df-991a-d403df7e9995-healthcheck-log\") pod \"7847497b-f082-46df-991a-d403df7e9995\" (UID: \"7847497b-f082-46df-991a-d403df7e9995\") " Feb 23 00:24:15 crc kubenswrapper[4735]: I0223 00:24:15.724992 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/7847497b-f082-46df-991a-d403df7e9995-ceilometer-publisher\") pod \"7847497b-f082-46df-991a-d403df7e9995\" (UID: \"7847497b-f082-46df-991a-d403df7e9995\") " Feb 23 00:24:15 crc kubenswrapper[4735]: I0223 00:24:15.725055 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-642bd\" (UniqueName: \"kubernetes.io/projected/7847497b-f082-46df-991a-d403df7e9995-kube-api-access-642bd\") pod \"7847497b-f082-46df-991a-d403df7e9995\" (UID: \"7847497b-f082-46df-991a-d403df7e9995\") " Feb 23 00:24:15 crc kubenswrapper[4735]: I0223 00:24:15.725152 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/7847497b-f082-46df-991a-d403df7e9995-collectd-config\") pod \"7847497b-f082-46df-991a-d403df7e9995\" (UID: \"7847497b-f082-46df-991a-d403df7e9995\") " Feb 23 00:24:15 crc kubenswrapper[4735]: I0223 00:24:15.725234 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7847497b-f082-46df-991a-d403df7e9995-collectd-entrypoint-script\") pod \"7847497b-f082-46df-991a-d403df7e9995\" (UID: \"7847497b-f082-46df-991a-d403df7e9995\") " Feb 23 00:24:15 crc kubenswrapper[4735]: I0223 00:24:15.725274 4735 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/7847497b-f082-46df-991a-d403df7e9995-sensubility-config\") pod \"7847497b-f082-46df-991a-d403df7e9995\" (UID: \"7847497b-f082-46df-991a-d403df7e9995\") " Feb 23 00:24:15 crc kubenswrapper[4735]: I0223 00:24:15.725310 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7847497b-f082-46df-991a-d403df7e9995-ceilometer-entrypoint-script\") pod \"7847497b-f082-46df-991a-d403df7e9995\" (UID: \"7847497b-f082-46df-991a-d403df7e9995\") " Feb 23 00:24:15 crc kubenswrapper[4735]: I0223 00:24:15.734938 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7847497b-f082-46df-991a-d403df7e9995-kube-api-access-642bd" (OuterVolumeSpecName: "kube-api-access-642bd") pod "7847497b-f082-46df-991a-d403df7e9995" (UID: "7847497b-f082-46df-991a-d403df7e9995"). InnerVolumeSpecName "kube-api-access-642bd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:24:15 crc kubenswrapper[4735]: I0223 00:24:15.748632 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7847497b-f082-46df-991a-d403df7e9995-healthcheck-log" (OuterVolumeSpecName: "healthcheck-log") pod "7847497b-f082-46df-991a-d403df7e9995" (UID: "7847497b-f082-46df-991a-d403df7e9995"). InnerVolumeSpecName "healthcheck-log". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:24:15 crc kubenswrapper[4735]: I0223 00:24:15.755212 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7847497b-f082-46df-991a-d403df7e9995-ceilometer-publisher" (OuterVolumeSpecName: "ceilometer-publisher") pod "7847497b-f082-46df-991a-d403df7e9995" (UID: "7847497b-f082-46df-991a-d403df7e9995"). InnerVolumeSpecName "ceilometer-publisher". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:24:15 crc kubenswrapper[4735]: I0223 00:24:15.757485 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7847497b-f082-46df-991a-d403df7e9995-ceilometer-entrypoint-script" (OuterVolumeSpecName: "ceilometer-entrypoint-script") pod "7847497b-f082-46df-991a-d403df7e9995" (UID: "7847497b-f082-46df-991a-d403df7e9995"). InnerVolumeSpecName "ceilometer-entrypoint-script". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:24:15 crc kubenswrapper[4735]: I0223 00:24:15.761502 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7847497b-f082-46df-991a-d403df7e9995-collectd-entrypoint-script" (OuterVolumeSpecName: "collectd-entrypoint-script") pod "7847497b-f082-46df-991a-d403df7e9995" (UID: "7847497b-f082-46df-991a-d403df7e9995"). InnerVolumeSpecName "collectd-entrypoint-script". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:24:15 crc kubenswrapper[4735]: I0223 00:24:15.765835 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7847497b-f082-46df-991a-d403df7e9995-sensubility-config" (OuterVolumeSpecName: "sensubility-config") pod "7847497b-f082-46df-991a-d403df7e9995" (UID: "7847497b-f082-46df-991a-d403df7e9995"). InnerVolumeSpecName "sensubility-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:24:15 crc kubenswrapper[4735]: I0223 00:24:15.767342 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7847497b-f082-46df-991a-d403df7e9995-collectd-config" (OuterVolumeSpecName: "collectd-config") pod "7847497b-f082-46df-991a-d403df7e9995" (UID: "7847497b-f082-46df-991a-d403df7e9995"). InnerVolumeSpecName "collectd-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:24:15 crc kubenswrapper[4735]: I0223 00:24:15.827227 4735 reconciler_common.go:293] "Volume detached for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7847497b-f082-46df-991a-d403df7e9995-collectd-entrypoint-script\") on node \"crc\" DevicePath \"\"" Feb 23 00:24:15 crc kubenswrapper[4735]: I0223 00:24:15.827286 4735 reconciler_common.go:293] "Volume detached for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/7847497b-f082-46df-991a-d403df7e9995-sensubility-config\") on node \"crc\" DevicePath \"\"" Feb 23 00:24:15 crc kubenswrapper[4735]: I0223 00:24:15.827309 4735 reconciler_common.go:293] "Volume detached for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7847497b-f082-46df-991a-d403df7e9995-ceilometer-entrypoint-script\") on node \"crc\" DevicePath \"\"" Feb 23 00:24:15 crc kubenswrapper[4735]: I0223 00:24:15.827329 4735 reconciler_common.go:293] "Volume detached for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/7847497b-f082-46df-991a-d403df7e9995-healthcheck-log\") on node \"crc\" DevicePath \"\"" Feb 23 00:24:15 crc kubenswrapper[4735]: I0223 00:24:15.827349 4735 reconciler_common.go:293] "Volume detached for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/7847497b-f082-46df-991a-d403df7e9995-ceilometer-publisher\") on node \"crc\" DevicePath \"\"" Feb 23 00:24:15 crc kubenswrapper[4735]: I0223 00:24:15.827368 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-642bd\" (UniqueName: \"kubernetes.io/projected/7847497b-f082-46df-991a-d403df7e9995-kube-api-access-642bd\") on node \"crc\" DevicePath \"\"" Feb 23 00:24:15 crc kubenswrapper[4735]: I0223 00:24:15.827385 4735 reconciler_common.go:293] "Volume detached for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/7847497b-f082-46df-991a-d403df7e9995-collectd-config\") on node 
\"crc\" DevicePath \"\"" Feb 23 00:24:16 crc kubenswrapper[4735]: I0223 00:24:16.272584 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-prvr4" Feb 23 00:24:16 crc kubenswrapper[4735]: I0223 00:24:16.283303 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-prvr4" event={"ID":"7847497b-f082-46df-991a-d403df7e9995","Type":"ContainerDied","Data":"5227e4c91ccf2b8f516302f3bb98c3f8a1152e3f2bfaa814d13994d1d45eba99"} Feb 23 00:24:16 crc kubenswrapper[4735]: I0223 00:24:16.283347 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5227e4c91ccf2b8f516302f3bb98c3f8a1152e3f2bfaa814d13994d1d45eba99" Feb 23 00:24:17 crc kubenswrapper[4735]: I0223 00:24:17.800802 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_stf-smoketest-smoke1-prvr4_7847497b-f082-46df-991a-d403df7e9995/smoketest-collectd/0.log" Feb 23 00:24:18 crc kubenswrapper[4735]: I0223 00:24:18.130295 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_stf-smoketest-smoke1-prvr4_7847497b-f082-46df-991a-d403df7e9995/smoketest-ceilometer/0.log" Feb 23 00:24:18 crc kubenswrapper[4735]: I0223 00:24:18.455694 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-interconnect-68864d46cb-p5qtk_fed58dc1-ad7e-4c98-b914-6c72c64c1367/default-interconnect/0.log" Feb 23 00:24:18 crc kubenswrapper[4735]: I0223 00:24:18.764068 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq_76aa4cee-ada5-4bc9-854f-127390e15fd2/bridge/2.log" Feb 23 00:24:19 crc kubenswrapper[4735]: I0223 00:24:19.037141 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7996dc9458-qswlq_76aa4cee-ada5-4bc9-854f-127390e15fd2/sg-core/0.log" Feb 23 00:24:19 
crc kubenswrapper[4735]: I0223 00:24:19.269087 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-7595f5577f-ddp8t_1b92acb6-6ef0-47d8-a7a4-ae61aaeb513d/bridge/2.log" Feb 23 00:24:19 crc kubenswrapper[4735]: I0223 00:24:19.597550 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-7595f5577f-ddp8t_1b92acb6-6ef0-47d8-a7a4-ae61aaeb513d/sg-core/0.log" Feb 23 00:24:19 crc kubenswrapper[4735]: I0223 00:24:19.884187 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht_475fb5a3-ada6-41f2-baff-da287ca512f3/bridge/2.log" Feb 23 00:24:20 crc kubenswrapper[4735]: I0223 00:24:20.186449 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-b57f974ff-92cht_475fb5a3-ada6-41f2-baff-da287ca512f3/sg-core/0.log" Feb 23 00:24:20 crc kubenswrapper[4735]: I0223 00:24:20.495388 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-5945b6d778-p4wnt_582cde1e-9aab-4e5f-bad8-f9c2d01a5742/bridge/2.log" Feb 23 00:24:20 crc kubenswrapper[4735]: I0223 00:24:20.781936 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-5945b6d778-p4wnt_582cde1e-9aab-4e5f-bad8-f9c2d01a5742/sg-core/0.log" Feb 23 00:24:21 crc kubenswrapper[4735]: I0223 00:24:21.065494 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7_6e748a5f-2b4d-4815-9b8f-0429055eae26/bridge/2.log" Feb 23 00:24:21 crc kubenswrapper[4735]: I0223 00:24:21.330445 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-6864f4fb65-mzst7_6e748a5f-2b4d-4815-9b8f-0429055eae26/sg-core/0.log" Feb 23 00:24:24 crc kubenswrapper[4735]: I0223 00:24:24.118299 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bbbc889bc-wbwd6_b1475d02-2bda-43e0-adc0-064144972174/operator/0.log" Feb 23 00:24:24 crc kubenswrapper[4735]: I0223 00:24:24.492258 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-default-0_f1292e6a-8c09-4732-8766-c0ebcefbde0a/prometheus/0.log" Feb 23 00:24:24 crc kubenswrapper[4735]: I0223 00:24:24.833607 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_elasticsearch-es-default-0_aafde9e6-828a-44a3-b5e9-8ec98a576b23/elasticsearch/0.log" Feb 23 00:24:25 crc kubenswrapper[4735]: I0223 00:24:25.174271 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-78bcbbdcff-52mm9_213ffefc-86ab-4944-a9b5-888a99455d05/prometheus-webhook-snmp/0.log" Feb 23 00:24:25 crc kubenswrapper[4735]: I0223 00:24:25.537104 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_alertmanager-default-0_1ad20943-871d-4331-a0e1-5145da93efee/alertmanager/0.log" Feb 23 00:24:39 crc kubenswrapper[4735]: I0223 00:24:39.025089 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-55b89ddfb9-frnfs_8be8f82e-4edc-4812-a4d0-2e3dda7788bf/operator/0.log" Feb 23 00:24:42 crc kubenswrapper[4735]: I0223 00:24:42.491296 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bbbc889bc-wbwd6_b1475d02-2bda-43e0-adc0-064144972174/operator/0.log" Feb 23 00:24:42 crc kubenswrapper[4735]: I0223 00:24:42.804716 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_qdr-test_260aad7a-143f-47d5-ab79-5cfd8eb26eb9/qdr/0.log" Feb 23 
00:25:17 crc kubenswrapper[4735]: I0223 00:25:17.146194 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lsmrl/must-gather-v6mtr"] Feb 23 00:25:17 crc kubenswrapper[4735]: E0223 00:25:17.147310 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7847497b-f082-46df-991a-d403df7e9995" containerName="smoketest-collectd" Feb 23 00:25:17 crc kubenswrapper[4735]: I0223 00:25:17.147335 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7847497b-f082-46df-991a-d403df7e9995" containerName="smoketest-collectd" Feb 23 00:25:17 crc kubenswrapper[4735]: E0223 00:25:17.147355 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7847497b-f082-46df-991a-d403df7e9995" containerName="smoketest-ceilometer" Feb 23 00:25:17 crc kubenswrapper[4735]: I0223 00:25:17.147367 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="7847497b-f082-46df-991a-d403df7e9995" containerName="smoketest-ceilometer" Feb 23 00:25:17 crc kubenswrapper[4735]: E0223 00:25:17.147391 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6592f4c0-2d9d-46ab-9dfb-c551f512ad6c" containerName="curl" Feb 23 00:25:17 crc kubenswrapper[4735]: I0223 00:25:17.147403 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="6592f4c0-2d9d-46ab-9dfb-c551f512ad6c" containerName="curl" Feb 23 00:25:17 crc kubenswrapper[4735]: I0223 00:25:17.147698 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="6592f4c0-2d9d-46ab-9dfb-c551f512ad6c" containerName="curl" Feb 23 00:25:17 crc kubenswrapper[4735]: I0223 00:25:17.147739 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="7847497b-f082-46df-991a-d403df7e9995" containerName="smoketest-ceilometer" Feb 23 00:25:17 crc kubenswrapper[4735]: I0223 00:25:17.147758 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="7847497b-f082-46df-991a-d403df7e9995" containerName="smoketest-collectd" Feb 23 00:25:17 crc kubenswrapper[4735]: I0223 
00:25:17.149023 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lsmrl/must-gather-v6mtr" Feb 23 00:25:17 crc kubenswrapper[4735]: I0223 00:25:17.159213 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8znhz\" (UniqueName: \"kubernetes.io/projected/1c577a42-4dda-48e8-9daf-ab693312a53d-kube-api-access-8znhz\") pod \"must-gather-v6mtr\" (UID: \"1c577a42-4dda-48e8-9daf-ab693312a53d\") " pod="openshift-must-gather-lsmrl/must-gather-v6mtr" Feb 23 00:25:17 crc kubenswrapper[4735]: I0223 00:25:17.159274 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1c577a42-4dda-48e8-9daf-ab693312a53d-must-gather-output\") pod \"must-gather-v6mtr\" (UID: \"1c577a42-4dda-48e8-9daf-ab693312a53d\") " pod="openshift-must-gather-lsmrl/must-gather-v6mtr" Feb 23 00:25:17 crc kubenswrapper[4735]: I0223 00:25:17.160101 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-lsmrl"/"openshift-service-ca.crt" Feb 23 00:25:17 crc kubenswrapper[4735]: I0223 00:25:17.160131 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-lsmrl"/"default-dockercfg-65gzl" Feb 23 00:25:17 crc kubenswrapper[4735]: I0223 00:25:17.160597 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-lsmrl"/"kube-root-ca.crt" Feb 23 00:25:17 crc kubenswrapper[4735]: I0223 00:25:17.167843 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lsmrl/must-gather-v6mtr"] Feb 23 00:25:17 crc kubenswrapper[4735]: I0223 00:25:17.260483 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8znhz\" (UniqueName: \"kubernetes.io/projected/1c577a42-4dda-48e8-9daf-ab693312a53d-kube-api-access-8znhz\") pod \"must-gather-v6mtr\" 
(UID: \"1c577a42-4dda-48e8-9daf-ab693312a53d\") " pod="openshift-must-gather-lsmrl/must-gather-v6mtr" Feb 23 00:25:17 crc kubenswrapper[4735]: I0223 00:25:17.260529 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1c577a42-4dda-48e8-9daf-ab693312a53d-must-gather-output\") pod \"must-gather-v6mtr\" (UID: \"1c577a42-4dda-48e8-9daf-ab693312a53d\") " pod="openshift-must-gather-lsmrl/must-gather-v6mtr" Feb 23 00:25:17 crc kubenswrapper[4735]: I0223 00:25:17.260956 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1c577a42-4dda-48e8-9daf-ab693312a53d-must-gather-output\") pod \"must-gather-v6mtr\" (UID: \"1c577a42-4dda-48e8-9daf-ab693312a53d\") " pod="openshift-must-gather-lsmrl/must-gather-v6mtr" Feb 23 00:25:17 crc kubenswrapper[4735]: I0223 00:25:17.279554 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8znhz\" (UniqueName: \"kubernetes.io/projected/1c577a42-4dda-48e8-9daf-ab693312a53d-kube-api-access-8znhz\") pod \"must-gather-v6mtr\" (UID: \"1c577a42-4dda-48e8-9daf-ab693312a53d\") " pod="openshift-must-gather-lsmrl/must-gather-v6mtr" Feb 23 00:25:17 crc kubenswrapper[4735]: I0223 00:25:17.468052 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lsmrl/must-gather-v6mtr" Feb 23 00:25:17 crc kubenswrapper[4735]: I0223 00:25:17.711437 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lsmrl/must-gather-v6mtr"] Feb 23 00:25:17 crc kubenswrapper[4735]: I0223 00:25:17.770648 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lsmrl/must-gather-v6mtr" event={"ID":"1c577a42-4dda-48e8-9daf-ab693312a53d","Type":"ContainerStarted","Data":"118ce157b18e650cc07998ef4759d58b726457049e3911d2d80dacfb7e269c5c"} Feb 23 00:25:25 crc kubenswrapper[4735]: I0223 00:25:25.849014 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lsmrl/must-gather-v6mtr" event={"ID":"1c577a42-4dda-48e8-9daf-ab693312a53d","Type":"ContainerStarted","Data":"d813e3ba9b56052dc244e260e955466403b00f754b692f8f11387a75040af27f"} Feb 23 00:25:25 crc kubenswrapper[4735]: I0223 00:25:25.849537 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lsmrl/must-gather-v6mtr" event={"ID":"1c577a42-4dda-48e8-9daf-ab693312a53d","Type":"ContainerStarted","Data":"6d94d788ac9a6f97af97752bbc12027ae59276ea10eed2a1844b6ea3f7279ae5"} Feb 23 00:26:11 crc kubenswrapper[4735]: I0223 00:26:11.512253 4735 patch_prober.go:28] interesting pod/machine-config-daemon-blmnv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 00:26:11 crc kubenswrapper[4735]: I0223 00:26:11.512908 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" podUID="1cba474f-2d55-4a07-969f-25e2817a06d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 00:26:13 crc 
kubenswrapper[4735]: I0223 00:26:13.494323 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-xmz5p_0b661fda-d14e-4491-896c-4d6812a638b5/control-plane-machine-set-operator/0.log" Feb 23 00:26:13 crc kubenswrapper[4735]: I0223 00:26:13.638089 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-4mwf4_f0643721-5a54-4c37-b857-474ed61ef531/kube-rbac-proxy/0.log" Feb 23 00:26:13 crc kubenswrapper[4735]: I0223 00:26:13.672451 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-4mwf4_f0643721-5a54-4c37-b857-474ed61ef531/machine-api-operator/0.log" Feb 23 00:26:27 crc kubenswrapper[4735]: I0223 00:26:27.730470 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-qxczd_70509649-f213-44de-83a0-1be5da4e7a13/cert-manager-controller/0.log" Feb 23 00:26:27 crc kubenswrapper[4735]: I0223 00:26:27.896847 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-lqrn9_2f59de9f-a682-4d8d-87d7-8f36ef7c6fa8/cert-manager-cainjector/0.log" Feb 23 00:26:27 crc kubenswrapper[4735]: I0223 00:26:27.952325 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-28whv_36e0600e-f4a8-4c41-988a-f64e2a5db19f/cert-manager-webhook/0.log" Feb 23 00:26:41 crc kubenswrapper[4735]: I0223 00:26:41.512583 4735 patch_prober.go:28] interesting pod/machine-config-daemon-blmnv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 00:26:41 crc kubenswrapper[4735]: I0223 00:26:41.513265 4735 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-blmnv" podUID="1cba474f-2d55-4a07-969f-25e2817a06d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 00:26:44 crc kubenswrapper[4735]: I0223 00:26:44.477351 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-t22rv_1e169620-1b41-4184-9268-ff74d8f3e1a5/prometheus-operator/0.log" Feb 23 00:26:44 crc kubenswrapper[4735]: I0223 00:26:44.550789 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7d8ddb755b-qzw7j_8dc28938-196d-418f-8b9e-9e41eca4ee56/prometheus-operator-admission-webhook/0.log" Feb 23 00:26:44 crc kubenswrapper[4735]: I0223 00:26:44.637677 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7d8ddb755b-zz6rp_7a5bb4ed-d256-4bcf-aae6-1959a199d920/prometheus-operator-admission-webhook/0.log" Feb 23 00:26:44 crc kubenswrapper[4735]: I0223 00:26:44.763926 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-8s6qj_bf421747-d273-4d48-bc0c-dd2947ac646a/operator/0.log" Feb 23 00:26:44 crc kubenswrapper[4735]: I0223 00:26:44.837513 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-vm7rb_dc58bb3c-27f0-4384-836e-caf92997ba93/perses-operator/0.log" Feb 23 00:26:59 crc kubenswrapper[4735]: I0223 00:26:59.417587 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1btjvv_ac06d817-3d30-4d1b-aa9c-bcff267ad35c/util/0.log" Feb 23 00:26:59 crc kubenswrapper[4735]: I0223 00:26:59.572072 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1btjvv_ac06d817-3d30-4d1b-aa9c-bcff267ad35c/util/0.log" Feb 23 00:26:59 crc kubenswrapper[4735]: I0223 00:26:59.592818 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1btjvv_ac06d817-3d30-4d1b-aa9c-bcff267ad35c/pull/0.log" Feb 23 00:26:59 crc kubenswrapper[4735]: I0223 00:26:59.613074 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1btjvv_ac06d817-3d30-4d1b-aa9c-bcff267ad35c/pull/0.log" Feb 23 00:26:59 crc kubenswrapper[4735]: I0223 00:26:59.779738 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1btjvv_ac06d817-3d30-4d1b-aa9c-bcff267ad35c/extract/0.log" Feb 23 00:26:59 crc kubenswrapper[4735]: I0223 00:26:59.783529 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1btjvv_ac06d817-3d30-4d1b-aa9c-bcff267ad35c/util/0.log" Feb 23 00:26:59 crc kubenswrapper[4735]: I0223 00:26:59.805505 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_00e596f74c7ff6aa630d3bf44b91123ebafce6c9d7df4104f82e2338f1btjvv_ac06d817-3d30-4d1b-aa9c-bcff267ad35c/pull/0.log" Feb 23 00:26:59 crc kubenswrapper[4735]: I0223 00:26:59.965794 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8ftcbct_f6b98c30-8554-4610-a645-727013065876/util/0.log" Feb 23 00:27:00 crc kubenswrapper[4735]: I0223 00:27:00.101837 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8ftcbct_f6b98c30-8554-4610-a645-727013065876/pull/0.log" Feb 23 
00:27:00 crc kubenswrapper[4735]: I0223 00:27:00.111503 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8ftcbct_f6b98c30-8554-4610-a645-727013065876/util/0.log" Feb 23 00:27:00 crc kubenswrapper[4735]: I0223 00:27:00.115000 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8ftcbct_f6b98c30-8554-4610-a645-727013065876/pull/0.log" Feb 23 00:27:00 crc kubenswrapper[4735]: I0223 00:27:00.254324 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8ftcbct_f6b98c30-8554-4610-a645-727013065876/util/0.log" Feb 23 00:27:00 crc kubenswrapper[4735]: I0223 00:27:00.256108 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8ftcbct_f6b98c30-8554-4610-a645-727013065876/pull/0.log" Feb 23 00:27:00 crc kubenswrapper[4735]: I0223 00:27:00.293924 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8ftcbct_f6b98c30-8554-4610-a645-727013065876/extract/0.log" Feb 23 00:27:00 crc kubenswrapper[4735]: I0223 00:27:00.417955 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jm8dz_84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684/util/0.log" Feb 23 00:27:00 crc kubenswrapper[4735]: I0223 00:27:00.589824 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jm8dz_84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684/util/0.log" Feb 23 00:27:00 crc kubenswrapper[4735]: I0223 00:27:00.621968 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jm8dz_84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684/pull/0.log" Feb 23 00:27:00 crc kubenswrapper[4735]: I0223 00:27:00.675744 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jm8dz_84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684/pull/0.log" Feb 23 00:27:00 crc kubenswrapper[4735]: I0223 00:27:00.796117 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jm8dz_84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684/pull/0.log" Feb 23 00:27:00 crc kubenswrapper[4735]: I0223 00:27:00.799458 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jm8dz_84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684/util/0.log" Feb 23 00:27:00 crc kubenswrapper[4735]: I0223 00:27:00.806704 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5jm8dz_84a0d9ea-4fb6-4cce-98d0-54ab1a6d0684/extract/0.log" Feb 23 00:27:00 crc kubenswrapper[4735]: I0223 00:27:00.965104 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08dz6s8_2367a963-af46-4385-b85d-75ab46713b1f/util/0.log" Feb 23 00:27:01 crc kubenswrapper[4735]: I0223 00:27:01.102322 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08dz6s8_2367a963-af46-4385-b85d-75ab46713b1f/pull/0.log" Feb 23 00:27:01 crc kubenswrapper[4735]: I0223 00:27:01.127006 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08dz6s8_2367a963-af46-4385-b85d-75ab46713b1f/pull/0.log" Feb 23 
00:27:01 crc kubenswrapper[4735]: I0223 00:27:01.140040 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08dz6s8_2367a963-af46-4385-b85d-75ab46713b1f/util/0.log" Feb 23 00:27:01 crc kubenswrapper[4735]: I0223 00:27:01.285088 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08dz6s8_2367a963-af46-4385-b85d-75ab46713b1f/util/0.log" Feb 23 00:27:01 crc kubenswrapper[4735]: I0223 00:27:01.286722 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08dz6s8_2367a963-af46-4385-b85d-75ab46713b1f/extract/0.log" Feb 23 00:27:01 crc kubenswrapper[4735]: I0223 00:27:01.287625 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08dz6s8_2367a963-af46-4385-b85d-75ab46713b1f/pull/0.log" Feb 23 00:27:01 crc kubenswrapper[4735]: I0223 00:27:01.440808 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4qlrn_8c37bfa2-47b2-493c-a4ff-5342118dcf93/extract-utilities/0.log" Feb 23 00:27:01 crc kubenswrapper[4735]: I0223 00:27:01.598643 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4qlrn_8c37bfa2-47b2-493c-a4ff-5342118dcf93/extract-utilities/0.log" Feb 23 00:27:01 crc kubenswrapper[4735]: I0223 00:27:01.612671 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4qlrn_8c37bfa2-47b2-493c-a4ff-5342118dcf93/extract-content/0.log" Feb 23 00:27:01 crc kubenswrapper[4735]: I0223 00:27:01.649915 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4qlrn_8c37bfa2-47b2-493c-a4ff-5342118dcf93/extract-content/0.log" Feb 23 
00:27:01 crc kubenswrapper[4735]: I0223 00:27:01.811582 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4qlrn_8c37bfa2-47b2-493c-a4ff-5342118dcf93/extract-utilities/0.log" Feb 23 00:27:01 crc kubenswrapper[4735]: I0223 00:27:01.828132 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4qlrn_8c37bfa2-47b2-493c-a4ff-5342118dcf93/extract-content/0.log" Feb 23 00:27:02 crc kubenswrapper[4735]: I0223 00:27:02.056069 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-l486w_51e3d9d6-dbc9-4600-a5f0-8e450b62b4c3/extract-utilities/0.log" Feb 23 00:27:02 crc kubenswrapper[4735]: I0223 00:27:02.104659 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-4qlrn_8c37bfa2-47b2-493c-a4ff-5342118dcf93/registry-server/0.log" Feb 23 00:27:02 crc kubenswrapper[4735]: I0223 00:27:02.161874 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-l486w_51e3d9d6-dbc9-4600-a5f0-8e450b62b4c3/extract-utilities/0.log" Feb 23 00:27:02 crc kubenswrapper[4735]: I0223 00:27:02.207499 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-l486w_51e3d9d6-dbc9-4600-a5f0-8e450b62b4c3/extract-content/0.log" Feb 23 00:27:02 crc kubenswrapper[4735]: I0223 00:27:02.208576 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-l486w_51e3d9d6-dbc9-4600-a5f0-8e450b62b4c3/extract-content/0.log" Feb 23 00:27:02 crc kubenswrapper[4735]: I0223 00:27:02.425305 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-l486w_51e3d9d6-dbc9-4600-a5f0-8e450b62b4c3/extract-content/0.log" Feb 23 00:27:02 crc kubenswrapper[4735]: I0223 00:27:02.434205 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-l486w_51e3d9d6-dbc9-4600-a5f0-8e450b62b4c3/extract-utilities/0.log" Feb 23 00:27:02 crc kubenswrapper[4735]: I0223 00:27:02.615609 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-l486w_51e3d9d6-dbc9-4600-a5f0-8e450b62b4c3/registry-server/0.log" Feb 23 00:27:02 crc kubenswrapper[4735]: I0223 00:27:02.625109 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-bcxr6_7c59f527-6557-45fe-9bd0-78a30ba8da40/marketplace-operator/0.log" Feb 23 00:27:02 crc kubenswrapper[4735]: I0223 00:27:02.668404 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dkjz8_71b6652a-e5e4-4ae8-967e-440f3912eca7/extract-utilities/0.log" Feb 23 00:27:02 crc kubenswrapper[4735]: I0223 00:27:02.837335 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dkjz8_71b6652a-e5e4-4ae8-967e-440f3912eca7/extract-content/0.log" Feb 23 00:27:02 crc kubenswrapper[4735]: I0223 00:27:02.839530 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dkjz8_71b6652a-e5e4-4ae8-967e-440f3912eca7/extract-utilities/0.log" Feb 23 00:27:02 crc kubenswrapper[4735]: I0223 00:27:02.842126 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dkjz8_71b6652a-e5e4-4ae8-967e-440f3912eca7/extract-content/0.log" Feb 23 00:27:03 crc kubenswrapper[4735]: I0223 00:27:03.008464 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dkjz8_71b6652a-e5e4-4ae8-967e-440f3912eca7/extract-utilities/0.log" Feb 23 00:27:03 crc kubenswrapper[4735]: I0223 00:27:03.032926 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-dkjz8_71b6652a-e5e4-4ae8-967e-440f3912eca7/extract-content/0.log" Feb 23 00:27:03 crc kubenswrapper[4735]: I0223 00:27:03.206504 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dkjz8_71b6652a-e5e4-4ae8-967e-440f3912eca7/registry-server/0.log" Feb 23 00:27:11 crc kubenswrapper[4735]: I0223 00:27:11.512466 4735 patch_prober.go:28] interesting pod/machine-config-daemon-blmnv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 00:27:11 crc kubenswrapper[4735]: I0223 00:27:11.513024 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" podUID="1cba474f-2d55-4a07-969f-25e2817a06d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 00:27:11 crc kubenswrapper[4735]: I0223 00:27:11.513098 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" Feb 23 00:27:11 crc kubenswrapper[4735]: I0223 00:27:11.514080 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f0184932c4f2bec10d1885f359e7f020b9e339d1e4ee18bc097d894c032130ad"} pod="openshift-machine-config-operator/machine-config-daemon-blmnv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 00:27:11 crc kubenswrapper[4735]: I0223 00:27:11.514181 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" podUID="1cba474f-2d55-4a07-969f-25e2817a06d0" 
containerName="machine-config-daemon" containerID="cri-o://f0184932c4f2bec10d1885f359e7f020b9e339d1e4ee18bc097d894c032130ad" gracePeriod=600 Feb 23 00:27:11 crc kubenswrapper[4735]: I0223 00:27:11.688614 4735 generic.go:334] "Generic (PLEG): container finished" podID="1cba474f-2d55-4a07-969f-25e2817a06d0" containerID="f0184932c4f2bec10d1885f359e7f020b9e339d1e4ee18bc097d894c032130ad" exitCode=0 Feb 23 00:27:11 crc kubenswrapper[4735]: I0223 00:27:11.688693 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" event={"ID":"1cba474f-2d55-4a07-969f-25e2817a06d0","Type":"ContainerDied","Data":"f0184932c4f2bec10d1885f359e7f020b9e339d1e4ee18bc097d894c032130ad"} Feb 23 00:27:11 crc kubenswrapper[4735]: I0223 00:27:11.689020 4735 scope.go:117] "RemoveContainer" containerID="a99b454aab2d4f811a91442480520f4ceedbb717995dcf3a4814eb9e1442c818" Feb 23 00:27:12 crc kubenswrapper[4735]: I0223 00:27:12.700730 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" event={"ID":"1cba474f-2d55-4a07-969f-25e2817a06d0","Type":"ContainerStarted","Data":"971457e5582daa6b8f14e4f6f6978da31f27bd372ce69ca2b19e1e7c57c8fa18"} Feb 23 00:27:12 crc kubenswrapper[4735]: I0223 00:27:12.731221 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-lsmrl/must-gather-v6mtr" podStartSLOduration=108.913059718 podStartE2EDuration="1m55.731195189s" podCreationTimestamp="2026-02-23 00:25:17 +0000 UTC" firstStartedPulling="2026-02-23 00:25:17.720883339 +0000 UTC m=+1076.184429310" lastFinishedPulling="2026-02-23 00:25:24.53901881 +0000 UTC m=+1083.002564781" observedRunningTime="2026-02-23 00:25:25.873663071 +0000 UTC m=+1084.337209082" watchObservedRunningTime="2026-02-23 00:27:12.731195189 +0000 UTC m=+1191.194741200" Feb 23 00:27:16 crc kubenswrapper[4735]: I0223 00:27:16.475715 4735 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-t22rv_1e169620-1b41-4184-9268-ff74d8f3e1a5/prometheus-operator/0.log" Feb 23 00:27:16 crc kubenswrapper[4735]: I0223 00:27:16.480920 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7d8ddb755b-zz6rp_7a5bb4ed-d256-4bcf-aae6-1959a199d920/prometheus-operator-admission-webhook/0.log" Feb 23 00:27:16 crc kubenswrapper[4735]: I0223 00:27:16.481866 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7d8ddb755b-qzw7j_8dc28938-196d-418f-8b9e-9e41eca4ee56/prometheus-operator-admission-webhook/0.log" Feb 23 00:27:16 crc kubenswrapper[4735]: I0223 00:27:16.600710 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-8s6qj_bf421747-d273-4d48-bc0c-dd2947ac646a/operator/0.log" Feb 23 00:27:16 crc kubenswrapper[4735]: I0223 00:27:16.642574 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-vm7rb_dc58bb3c-27f0-4384-836e-caf92997ba93/perses-operator/0.log" Feb 23 00:28:06 crc kubenswrapper[4735]: I0223 00:28:06.191876 4735 generic.go:334] "Generic (PLEG): container finished" podID="1c577a42-4dda-48e8-9daf-ab693312a53d" containerID="6d94d788ac9a6f97af97752bbc12027ae59276ea10eed2a1844b6ea3f7279ae5" exitCode=0 Feb 23 00:28:06 crc kubenswrapper[4735]: I0223 00:28:06.192014 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lsmrl/must-gather-v6mtr" event={"ID":"1c577a42-4dda-48e8-9daf-ab693312a53d","Type":"ContainerDied","Data":"6d94d788ac9a6f97af97752bbc12027ae59276ea10eed2a1844b6ea3f7279ae5"} Feb 23 00:28:06 crc kubenswrapper[4735]: I0223 00:28:06.194787 4735 scope.go:117] "RemoveContainer" containerID="6d94d788ac9a6f97af97752bbc12027ae59276ea10eed2a1844b6ea3f7279ae5" Feb 23 00:28:06 crc kubenswrapper[4735]: 
I0223 00:28:06.620766 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-lsmrl_must-gather-v6mtr_1c577a42-4dda-48e8-9daf-ab693312a53d/gather/0.log" Feb 23 00:28:13 crc kubenswrapper[4735]: I0223 00:28:13.466425 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-lsmrl/must-gather-v6mtr"] Feb 23 00:28:13 crc kubenswrapper[4735]: I0223 00:28:13.467187 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-lsmrl/must-gather-v6mtr" podUID="1c577a42-4dda-48e8-9daf-ab693312a53d" containerName="copy" containerID="cri-o://d813e3ba9b56052dc244e260e955466403b00f754b692f8f11387a75040af27f" gracePeriod=2 Feb 23 00:28:13 crc kubenswrapper[4735]: I0223 00:28:13.478150 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-lsmrl/must-gather-v6mtr"] Feb 23 00:28:13 crc kubenswrapper[4735]: I0223 00:28:13.871186 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-lsmrl_must-gather-v6mtr_1c577a42-4dda-48e8-9daf-ab693312a53d/copy/0.log" Feb 23 00:28:13 crc kubenswrapper[4735]: I0223 00:28:13.872223 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lsmrl/must-gather-v6mtr" Feb 23 00:28:14 crc kubenswrapper[4735]: I0223 00:28:14.023458 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8znhz\" (UniqueName: \"kubernetes.io/projected/1c577a42-4dda-48e8-9daf-ab693312a53d-kube-api-access-8znhz\") pod \"1c577a42-4dda-48e8-9daf-ab693312a53d\" (UID: \"1c577a42-4dda-48e8-9daf-ab693312a53d\") " Feb 23 00:28:14 crc kubenswrapper[4735]: I0223 00:28:14.023594 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1c577a42-4dda-48e8-9daf-ab693312a53d-must-gather-output\") pod \"1c577a42-4dda-48e8-9daf-ab693312a53d\" (UID: \"1c577a42-4dda-48e8-9daf-ab693312a53d\") " Feb 23 00:28:14 crc kubenswrapper[4735]: I0223 00:28:14.038102 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c577a42-4dda-48e8-9daf-ab693312a53d-kube-api-access-8znhz" (OuterVolumeSpecName: "kube-api-access-8znhz") pod "1c577a42-4dda-48e8-9daf-ab693312a53d" (UID: "1c577a42-4dda-48e8-9daf-ab693312a53d"). InnerVolumeSpecName "kube-api-access-8znhz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:28:14 crc kubenswrapper[4735]: I0223 00:28:14.077166 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c577a42-4dda-48e8-9daf-ab693312a53d-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "1c577a42-4dda-48e8-9daf-ab693312a53d" (UID: "1c577a42-4dda-48e8-9daf-ab693312a53d"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 00:28:14 crc kubenswrapper[4735]: I0223 00:28:14.125551 4735 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1c577a42-4dda-48e8-9daf-ab693312a53d-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 23 00:28:14 crc kubenswrapper[4735]: I0223 00:28:14.125600 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8znhz\" (UniqueName: \"kubernetes.io/projected/1c577a42-4dda-48e8-9daf-ab693312a53d-kube-api-access-8znhz\") on node \"crc\" DevicePath \"\"" Feb 23 00:28:14 crc kubenswrapper[4735]: I0223 00:28:14.276762 4735 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-lsmrl_must-gather-v6mtr_1c577a42-4dda-48e8-9daf-ab693312a53d/copy/0.log" Feb 23 00:28:14 crc kubenswrapper[4735]: I0223 00:28:14.277206 4735 generic.go:334] "Generic (PLEG): container finished" podID="1c577a42-4dda-48e8-9daf-ab693312a53d" containerID="d813e3ba9b56052dc244e260e955466403b00f754b692f8f11387a75040af27f" exitCode=143 Feb 23 00:28:14 crc kubenswrapper[4735]: I0223 00:28:14.277273 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lsmrl/must-gather-v6mtr" Feb 23 00:28:14 crc kubenswrapper[4735]: I0223 00:28:14.279778 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c577a42-4dda-48e8-9daf-ab693312a53d" path="/var/lib/kubelet/pods/1c577a42-4dda-48e8-9daf-ab693312a53d/volumes" Feb 23 00:28:14 crc kubenswrapper[4735]: I0223 00:28:14.280374 4735 scope.go:117] "RemoveContainer" containerID="d813e3ba9b56052dc244e260e955466403b00f754b692f8f11387a75040af27f" Feb 23 00:28:14 crc kubenswrapper[4735]: I0223 00:28:14.304262 4735 scope.go:117] "RemoveContainer" containerID="6d94d788ac9a6f97af97752bbc12027ae59276ea10eed2a1844b6ea3f7279ae5" Feb 23 00:28:14 crc kubenswrapper[4735]: I0223 00:28:14.341276 4735 scope.go:117] "RemoveContainer" containerID="d813e3ba9b56052dc244e260e955466403b00f754b692f8f11387a75040af27f" Feb 23 00:28:14 crc kubenswrapper[4735]: E0223 00:28:14.341765 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d813e3ba9b56052dc244e260e955466403b00f754b692f8f11387a75040af27f\": container with ID starting with d813e3ba9b56052dc244e260e955466403b00f754b692f8f11387a75040af27f not found: ID does not exist" containerID="d813e3ba9b56052dc244e260e955466403b00f754b692f8f11387a75040af27f" Feb 23 00:28:14 crc kubenswrapper[4735]: I0223 00:28:14.341797 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d813e3ba9b56052dc244e260e955466403b00f754b692f8f11387a75040af27f"} err="failed to get container status \"d813e3ba9b56052dc244e260e955466403b00f754b692f8f11387a75040af27f\": rpc error: code = NotFound desc = could not find container \"d813e3ba9b56052dc244e260e955466403b00f754b692f8f11387a75040af27f\": container with ID starting with d813e3ba9b56052dc244e260e955466403b00f754b692f8f11387a75040af27f not found: ID does not exist" Feb 23 00:28:14 crc kubenswrapper[4735]: I0223 00:28:14.341833 4735 
scope.go:117] "RemoveContainer" containerID="6d94d788ac9a6f97af97752bbc12027ae59276ea10eed2a1844b6ea3f7279ae5" Feb 23 00:28:14 crc kubenswrapper[4735]: E0223 00:28:14.342285 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d94d788ac9a6f97af97752bbc12027ae59276ea10eed2a1844b6ea3f7279ae5\": container with ID starting with 6d94d788ac9a6f97af97752bbc12027ae59276ea10eed2a1844b6ea3f7279ae5 not found: ID does not exist" containerID="6d94d788ac9a6f97af97752bbc12027ae59276ea10eed2a1844b6ea3f7279ae5" Feb 23 00:28:14 crc kubenswrapper[4735]: I0223 00:28:14.342352 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d94d788ac9a6f97af97752bbc12027ae59276ea10eed2a1844b6ea3f7279ae5"} err="failed to get container status \"6d94d788ac9a6f97af97752bbc12027ae59276ea10eed2a1844b6ea3f7279ae5\": rpc error: code = NotFound desc = could not find container \"6d94d788ac9a6f97af97752bbc12027ae59276ea10eed2a1844b6ea3f7279ae5\": container with ID starting with 6d94d788ac9a6f97af97752bbc12027ae59276ea10eed2a1844b6ea3f7279ae5 not found: ID does not exist" Feb 23 00:29:11 crc kubenswrapper[4735]: I0223 00:29:11.512984 4735 patch_prober.go:28] interesting pod/machine-config-daemon-blmnv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 00:29:11 crc kubenswrapper[4735]: I0223 00:29:11.513578 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" podUID="1cba474f-2d55-4a07-969f-25e2817a06d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 00:29:41 crc kubenswrapper[4735]: I0223 00:29:41.512498 4735 
patch_prober.go:28] interesting pod/machine-config-daemon-blmnv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 00:29:41 crc kubenswrapper[4735]: I0223 00:29:41.513403 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" podUID="1cba474f-2d55-4a07-969f-25e2817a06d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 00:30:00 crc kubenswrapper[4735]: I0223 00:30:00.151841 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530110-rl5lk"] Feb 23 00:30:00 crc kubenswrapper[4735]: E0223 00:30:00.152603 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c577a42-4dda-48e8-9daf-ab693312a53d" containerName="copy" Feb 23 00:30:00 crc kubenswrapper[4735]: I0223 00:30:00.152617 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c577a42-4dda-48e8-9daf-ab693312a53d" containerName="copy" Feb 23 00:30:00 crc kubenswrapper[4735]: E0223 00:30:00.152635 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c577a42-4dda-48e8-9daf-ab693312a53d" containerName="gather" Feb 23 00:30:00 crc kubenswrapper[4735]: I0223 00:30:00.152643 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c577a42-4dda-48e8-9daf-ab693312a53d" containerName="gather" Feb 23 00:30:00 crc kubenswrapper[4735]: I0223 00:30:00.152780 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c577a42-4dda-48e8-9daf-ab693312a53d" containerName="gather" Feb 23 00:30:00 crc kubenswrapper[4735]: I0223 00:30:00.152797 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c577a42-4dda-48e8-9daf-ab693312a53d" 
containerName="copy" Feb 23 00:30:00 crc kubenswrapper[4735]: I0223 00:30:00.153327 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530110-rl5lk" Feb 23 00:30:00 crc kubenswrapper[4735]: I0223 00:30:00.156474 4735 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 23 00:30:00 crc kubenswrapper[4735]: I0223 00:30:00.156632 4735 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 23 00:30:00 crc kubenswrapper[4735]: I0223 00:30:00.187962 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530110-rl5lk"] Feb 23 00:30:00 crc kubenswrapper[4735]: I0223 00:30:00.249635 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdxsc\" (UniqueName: \"kubernetes.io/projected/5057adfe-e9ae-442b-bd9b-824aaed72adf-kube-api-access-xdxsc\") pod \"collect-profiles-29530110-rl5lk\" (UID: \"5057adfe-e9ae-442b-bd9b-824aaed72adf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530110-rl5lk" Feb 23 00:30:00 crc kubenswrapper[4735]: I0223 00:30:00.250117 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5057adfe-e9ae-442b-bd9b-824aaed72adf-config-volume\") pod \"collect-profiles-29530110-rl5lk\" (UID: \"5057adfe-e9ae-442b-bd9b-824aaed72adf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530110-rl5lk" Feb 23 00:30:00 crc kubenswrapper[4735]: I0223 00:30:00.250165 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5057adfe-e9ae-442b-bd9b-824aaed72adf-secret-volume\") pod 
\"collect-profiles-29530110-rl5lk\" (UID: \"5057adfe-e9ae-442b-bd9b-824aaed72adf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530110-rl5lk" Feb 23 00:30:00 crc kubenswrapper[4735]: I0223 00:30:00.352281 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5057adfe-e9ae-442b-bd9b-824aaed72adf-config-volume\") pod \"collect-profiles-29530110-rl5lk\" (UID: \"5057adfe-e9ae-442b-bd9b-824aaed72adf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530110-rl5lk" Feb 23 00:30:00 crc kubenswrapper[4735]: I0223 00:30:00.352361 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5057adfe-e9ae-442b-bd9b-824aaed72adf-secret-volume\") pod \"collect-profiles-29530110-rl5lk\" (UID: \"5057adfe-e9ae-442b-bd9b-824aaed72adf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530110-rl5lk" Feb 23 00:30:00 crc kubenswrapper[4735]: I0223 00:30:00.352522 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdxsc\" (UniqueName: \"kubernetes.io/projected/5057adfe-e9ae-442b-bd9b-824aaed72adf-kube-api-access-xdxsc\") pod \"collect-profiles-29530110-rl5lk\" (UID: \"5057adfe-e9ae-442b-bd9b-824aaed72adf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530110-rl5lk" Feb 23 00:30:00 crc kubenswrapper[4735]: I0223 00:30:00.354086 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5057adfe-e9ae-442b-bd9b-824aaed72adf-config-volume\") pod \"collect-profiles-29530110-rl5lk\" (UID: \"5057adfe-e9ae-442b-bd9b-824aaed72adf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530110-rl5lk" Feb 23 00:30:00 crc kubenswrapper[4735]: I0223 00:30:00.366790 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" 
(UniqueName: \"kubernetes.io/secret/5057adfe-e9ae-442b-bd9b-824aaed72adf-secret-volume\") pod \"collect-profiles-29530110-rl5lk\" (UID: \"5057adfe-e9ae-442b-bd9b-824aaed72adf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530110-rl5lk" Feb 23 00:30:00 crc kubenswrapper[4735]: I0223 00:30:00.389608 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdxsc\" (UniqueName: \"kubernetes.io/projected/5057adfe-e9ae-442b-bd9b-824aaed72adf-kube-api-access-xdxsc\") pod \"collect-profiles-29530110-rl5lk\" (UID: \"5057adfe-e9ae-442b-bd9b-824aaed72adf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29530110-rl5lk" Feb 23 00:30:00 crc kubenswrapper[4735]: I0223 00:30:00.489805 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530110-rl5lk" Feb 23 00:30:00 crc kubenswrapper[4735]: I0223 00:30:00.768213 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29530110-rl5lk"] Feb 23 00:30:01 crc kubenswrapper[4735]: I0223 00:30:01.278982 4735 generic.go:334] "Generic (PLEG): container finished" podID="5057adfe-e9ae-442b-bd9b-824aaed72adf" containerID="38bc119206600f1fed4c05b0123fc3fa728c6f36461d8879c9ebe163347f4c67" exitCode=0 Feb 23 00:30:01 crc kubenswrapper[4735]: I0223 00:30:01.279048 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530110-rl5lk" event={"ID":"5057adfe-e9ae-442b-bd9b-824aaed72adf","Type":"ContainerDied","Data":"38bc119206600f1fed4c05b0123fc3fa728c6f36461d8879c9ebe163347f4c67"} Feb 23 00:30:01 crc kubenswrapper[4735]: I0223 00:30:01.279088 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530110-rl5lk" 
event={"ID":"5057adfe-e9ae-442b-bd9b-824aaed72adf","Type":"ContainerStarted","Data":"6b222c6354b4827bd07045ad70ba8c42bf5973589fd0332da52aa6bec78b3949"} Feb 23 00:30:02 crc kubenswrapper[4735]: I0223 00:30:02.628130 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530110-rl5lk" Feb 23 00:30:02 crc kubenswrapper[4735]: I0223 00:30:02.796696 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdxsc\" (UniqueName: \"kubernetes.io/projected/5057adfe-e9ae-442b-bd9b-824aaed72adf-kube-api-access-xdxsc\") pod \"5057adfe-e9ae-442b-bd9b-824aaed72adf\" (UID: \"5057adfe-e9ae-442b-bd9b-824aaed72adf\") " Feb 23 00:30:02 crc kubenswrapper[4735]: I0223 00:30:02.796820 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5057adfe-e9ae-442b-bd9b-824aaed72adf-config-volume\") pod \"5057adfe-e9ae-442b-bd9b-824aaed72adf\" (UID: \"5057adfe-e9ae-442b-bd9b-824aaed72adf\") " Feb 23 00:30:02 crc kubenswrapper[4735]: I0223 00:30:02.796954 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5057adfe-e9ae-442b-bd9b-824aaed72adf-secret-volume\") pod \"5057adfe-e9ae-442b-bd9b-824aaed72adf\" (UID: \"5057adfe-e9ae-442b-bd9b-824aaed72adf\") " Feb 23 00:30:02 crc kubenswrapper[4735]: I0223 00:30:02.799574 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5057adfe-e9ae-442b-bd9b-824aaed72adf-config-volume" (OuterVolumeSpecName: "config-volume") pod "5057adfe-e9ae-442b-bd9b-824aaed72adf" (UID: "5057adfe-e9ae-442b-bd9b-824aaed72adf"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 23 00:30:02 crc kubenswrapper[4735]: I0223 00:30:02.802959 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5057adfe-e9ae-442b-bd9b-824aaed72adf-kube-api-access-xdxsc" (OuterVolumeSpecName: "kube-api-access-xdxsc") pod "5057adfe-e9ae-442b-bd9b-824aaed72adf" (UID: "5057adfe-e9ae-442b-bd9b-824aaed72adf"). InnerVolumeSpecName "kube-api-access-xdxsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:30:02 crc kubenswrapper[4735]: I0223 00:30:02.803782 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5057adfe-e9ae-442b-bd9b-824aaed72adf-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5057adfe-e9ae-442b-bd9b-824aaed72adf" (UID: "5057adfe-e9ae-442b-bd9b-824aaed72adf"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 23 00:30:02 crc kubenswrapper[4735]: I0223 00:30:02.899544 4735 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5057adfe-e9ae-442b-bd9b-824aaed72adf-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 23 00:30:02 crc kubenswrapper[4735]: I0223 00:30:02.899598 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdxsc\" (UniqueName: \"kubernetes.io/projected/5057adfe-e9ae-442b-bd9b-824aaed72adf-kube-api-access-xdxsc\") on node \"crc\" DevicePath \"\"" Feb 23 00:30:02 crc kubenswrapper[4735]: I0223 00:30:02.899620 4735 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5057adfe-e9ae-442b-bd9b-824aaed72adf-config-volume\") on node \"crc\" DevicePath \"\"" Feb 23 00:30:03 crc kubenswrapper[4735]: I0223 00:30:03.314916 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29530110-rl5lk" 
event={"ID":"5057adfe-e9ae-442b-bd9b-824aaed72adf","Type":"ContainerDied","Data":"6b222c6354b4827bd07045ad70ba8c42bf5973589fd0332da52aa6bec78b3949"} Feb 23 00:30:03 crc kubenswrapper[4735]: I0223 00:30:03.314970 4735 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b222c6354b4827bd07045ad70ba8c42bf5973589fd0332da52aa6bec78b3949" Feb 23 00:30:03 crc kubenswrapper[4735]: I0223 00:30:03.314997 4735 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29530110-rl5lk" Feb 23 00:30:11 crc kubenswrapper[4735]: I0223 00:30:11.513000 4735 patch_prober.go:28] interesting pod/machine-config-daemon-blmnv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 00:30:11 crc kubenswrapper[4735]: I0223 00:30:11.513877 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" podUID="1cba474f-2d55-4a07-969f-25e2817a06d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 00:30:11 crc kubenswrapper[4735]: I0223 00:30:11.513956 4735 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" Feb 23 00:30:11 crc kubenswrapper[4735]: I0223 00:30:11.515150 4735 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"971457e5582daa6b8f14e4f6f6978da31f27bd372ce69ca2b19e1e7c57c8fa18"} pod="openshift-machine-config-operator/machine-config-daemon-blmnv" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 23 00:30:11 crc 
kubenswrapper[4735]: I0223 00:30:11.515328 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" podUID="1cba474f-2d55-4a07-969f-25e2817a06d0" containerName="machine-config-daemon" containerID="cri-o://971457e5582daa6b8f14e4f6f6978da31f27bd372ce69ca2b19e1e7c57c8fa18" gracePeriod=600 Feb 23 00:30:12 crc kubenswrapper[4735]: I0223 00:30:12.399387 4735 generic.go:334] "Generic (PLEG): container finished" podID="1cba474f-2d55-4a07-969f-25e2817a06d0" containerID="971457e5582daa6b8f14e4f6f6978da31f27bd372ce69ca2b19e1e7c57c8fa18" exitCode=0 Feb 23 00:30:12 crc kubenswrapper[4735]: I0223 00:30:12.399467 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" event={"ID":"1cba474f-2d55-4a07-969f-25e2817a06d0","Type":"ContainerDied","Data":"971457e5582daa6b8f14e4f6f6978da31f27bd372ce69ca2b19e1e7c57c8fa18"} Feb 23 00:30:12 crc kubenswrapper[4735]: I0223 00:30:12.400296 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" event={"ID":"1cba474f-2d55-4a07-969f-25e2817a06d0","Type":"ContainerStarted","Data":"40cdd7cf85eb8c28e6799578b20f149a8acc120b13ed5684f87c3e21eadc2a61"} Feb 23 00:30:12 crc kubenswrapper[4735]: I0223 00:30:12.400339 4735 scope.go:117] "RemoveContainer" containerID="f0184932c4f2bec10d1885f359e7f020b9e339d1e4ee18bc097d894c032130ad" Feb 23 00:31:50 crc kubenswrapper[4735]: I0223 00:31:50.268141 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-72lkj"] Feb 23 00:31:50 crc kubenswrapper[4735]: E0223 00:31:50.269145 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5057adfe-e9ae-442b-bd9b-824aaed72adf" containerName="collect-profiles" Feb 23 00:31:50 crc kubenswrapper[4735]: I0223 00:31:50.269166 4735 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5057adfe-e9ae-442b-bd9b-824aaed72adf" containerName="collect-profiles" Feb 23 00:31:50 crc kubenswrapper[4735]: I0223 00:31:50.269413 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="5057adfe-e9ae-442b-bd9b-824aaed72adf" containerName="collect-profiles" Feb 23 00:31:50 crc kubenswrapper[4735]: I0223 00:31:50.271152 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-72lkj" Feb 23 00:31:50 crc kubenswrapper[4735]: I0223 00:31:50.296185 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-72lkj"] Feb 23 00:31:50 crc kubenswrapper[4735]: I0223 00:31:50.318240 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a150df6d-7047-4876-abab-189540f366fb-utilities\") pod \"community-operators-72lkj\" (UID: \"a150df6d-7047-4876-abab-189540f366fb\") " pod="openshift-marketplace/community-operators-72lkj" Feb 23 00:31:50 crc kubenswrapper[4735]: I0223 00:31:50.318422 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a150df6d-7047-4876-abab-189540f366fb-catalog-content\") pod \"community-operators-72lkj\" (UID: \"a150df6d-7047-4876-abab-189540f366fb\") " pod="openshift-marketplace/community-operators-72lkj" Feb 23 00:31:50 crc kubenswrapper[4735]: I0223 00:31:50.318578 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqwhr\" (UniqueName: \"kubernetes.io/projected/a150df6d-7047-4876-abab-189540f366fb-kube-api-access-hqwhr\") pod \"community-operators-72lkj\" (UID: \"a150df6d-7047-4876-abab-189540f366fb\") " pod="openshift-marketplace/community-operators-72lkj" Feb 23 00:31:50 crc kubenswrapper[4735]: I0223 00:31:50.421606 4735 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a150df6d-7047-4876-abab-189540f366fb-catalog-content\") pod \"community-operators-72lkj\" (UID: \"a150df6d-7047-4876-abab-189540f366fb\") " pod="openshift-marketplace/community-operators-72lkj" Feb 23 00:31:50 crc kubenswrapper[4735]: I0223 00:31:50.424618 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a150df6d-7047-4876-abab-189540f366fb-catalog-content\") pod \"community-operators-72lkj\" (UID: \"a150df6d-7047-4876-abab-189540f366fb\") " pod="openshift-marketplace/community-operators-72lkj" Feb 23 00:31:50 crc kubenswrapper[4735]: I0223 00:31:50.424751 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqwhr\" (UniqueName: \"kubernetes.io/projected/a150df6d-7047-4876-abab-189540f366fb-kube-api-access-hqwhr\") pod \"community-operators-72lkj\" (UID: \"a150df6d-7047-4876-abab-189540f366fb\") " pod="openshift-marketplace/community-operators-72lkj" Feb 23 00:31:50 crc kubenswrapper[4735]: I0223 00:31:50.425015 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a150df6d-7047-4876-abab-189540f366fb-utilities\") pod \"community-operators-72lkj\" (UID: \"a150df6d-7047-4876-abab-189540f366fb\") " pod="openshift-marketplace/community-operators-72lkj" Feb 23 00:31:50 crc kubenswrapper[4735]: I0223 00:31:50.425795 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a150df6d-7047-4876-abab-189540f366fb-utilities\") pod \"community-operators-72lkj\" (UID: \"a150df6d-7047-4876-abab-189540f366fb\") " pod="openshift-marketplace/community-operators-72lkj" Feb 23 00:31:50 crc kubenswrapper[4735]: I0223 00:31:50.458844 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-hqwhr\" (UniqueName: \"kubernetes.io/projected/a150df6d-7047-4876-abab-189540f366fb-kube-api-access-hqwhr\") pod \"community-operators-72lkj\" (UID: \"a150df6d-7047-4876-abab-189540f366fb\") " pod="openshift-marketplace/community-operators-72lkj" Feb 23 00:31:50 crc kubenswrapper[4735]: I0223 00:31:50.594306 4735 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-72lkj" Feb 23 00:31:51 crc kubenswrapper[4735]: I0223 00:31:51.060256 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-72lkj"] Feb 23 00:31:51 crc kubenswrapper[4735]: W0223 00:31:51.069181 4735 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda150df6d_7047_4876_abab_189540f366fb.slice/crio-80e028d329c50500347009f9bae314c3b949424899eb79a26504cd4dc12c84f2 WatchSource:0}: Error finding container 80e028d329c50500347009f9bae314c3b949424899eb79a26504cd4dc12c84f2: Status 404 returned error can't find the container with id 80e028d329c50500347009f9bae314c3b949424899eb79a26504cd4dc12c84f2 Feb 23 00:31:51 crc kubenswrapper[4735]: I0223 00:31:51.325440 4735 generic.go:334] "Generic (PLEG): container finished" podID="a150df6d-7047-4876-abab-189540f366fb" containerID="3e43f16d01ca0cdac694be0eca04bbaf920ee26735e72affc08f4d3717cfd7bd" exitCode=0 Feb 23 00:31:51 crc kubenswrapper[4735]: I0223 00:31:51.325486 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-72lkj" event={"ID":"a150df6d-7047-4876-abab-189540f366fb","Type":"ContainerDied","Data":"3e43f16d01ca0cdac694be0eca04bbaf920ee26735e72affc08f4d3717cfd7bd"} Feb 23 00:31:51 crc kubenswrapper[4735]: I0223 00:31:51.325538 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-72lkj" 
event={"ID":"a150df6d-7047-4876-abab-189540f366fb","Type":"ContainerStarted","Data":"80e028d329c50500347009f9bae314c3b949424899eb79a26504cd4dc12c84f2"} Feb 23 00:31:51 crc kubenswrapper[4735]: I0223 00:31:51.326901 4735 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 23 00:31:52 crc kubenswrapper[4735]: I0223 00:31:52.337738 4735 generic.go:334] "Generic (PLEG): container finished" podID="a150df6d-7047-4876-abab-189540f366fb" containerID="505c34aeac2b570b60333a1776ad0038b5f4618367111a3c14d2eadbe03e9470" exitCode=0 Feb 23 00:31:52 crc kubenswrapper[4735]: I0223 00:31:52.338005 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-72lkj" event={"ID":"a150df6d-7047-4876-abab-189540f366fb","Type":"ContainerDied","Data":"505c34aeac2b570b60333a1776ad0038b5f4618367111a3c14d2eadbe03e9470"} Feb 23 00:31:53 crc kubenswrapper[4735]: I0223 00:31:53.354667 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-72lkj" event={"ID":"a150df6d-7047-4876-abab-189540f366fb","Type":"ContainerStarted","Data":"82200e4bbe9b2e3fb4a935afdb53d439890fa7dc7cfd5feb35a4f674c5b2be6a"} Feb 23 00:31:53 crc kubenswrapper[4735]: I0223 00:31:53.391668 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-72lkj" podStartSLOduration=1.986636531 podStartE2EDuration="3.391638103s" podCreationTimestamp="2026-02-23 00:31:50 +0000 UTC" firstStartedPulling="2026-02-23 00:31:51.326691466 +0000 UTC m=+1469.790237437" lastFinishedPulling="2026-02-23 00:31:52.731692998 +0000 UTC m=+1471.195239009" observedRunningTime="2026-02-23 00:31:53.381402762 +0000 UTC m=+1471.844948733" watchObservedRunningTime="2026-02-23 00:31:53.391638103 +0000 UTC m=+1471.855184114" Feb 23 00:32:00 crc kubenswrapper[4735]: I0223 00:32:00.596232 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-72lkj" Feb 23 00:32:00 crc kubenswrapper[4735]: I0223 00:32:00.597077 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-72lkj" Feb 23 00:32:00 crc kubenswrapper[4735]: I0223 00:32:00.668408 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-72lkj" Feb 23 00:32:01 crc kubenswrapper[4735]: I0223 00:32:01.499277 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-72lkj" Feb 23 00:32:01 crc kubenswrapper[4735]: I0223 00:32:01.569772 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-72lkj"] Feb 23 00:32:03 crc kubenswrapper[4735]: I0223 00:32:03.441555 4735 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-72lkj" podUID="a150df6d-7047-4876-abab-189540f366fb" containerName="registry-server" containerID="cri-o://82200e4bbe9b2e3fb4a935afdb53d439890fa7dc7cfd5feb35a4f674c5b2be6a" gracePeriod=2 Feb 23 00:32:03 crc kubenswrapper[4735]: I0223 00:32:03.909520 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-72lkj" Feb 23 00:32:03 crc kubenswrapper[4735]: I0223 00:32:03.974530 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqwhr\" (UniqueName: \"kubernetes.io/projected/a150df6d-7047-4876-abab-189540f366fb-kube-api-access-hqwhr\") pod \"a150df6d-7047-4876-abab-189540f366fb\" (UID: \"a150df6d-7047-4876-abab-189540f366fb\") " Feb 23 00:32:03 crc kubenswrapper[4735]: I0223 00:32:03.975057 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a150df6d-7047-4876-abab-189540f366fb-catalog-content\") pod \"a150df6d-7047-4876-abab-189540f366fb\" (UID: \"a150df6d-7047-4876-abab-189540f366fb\") " Feb 23 00:32:03 crc kubenswrapper[4735]: I0223 00:32:03.975324 4735 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a150df6d-7047-4876-abab-189540f366fb-utilities\") pod \"a150df6d-7047-4876-abab-189540f366fb\" (UID: \"a150df6d-7047-4876-abab-189540f366fb\") " Feb 23 00:32:03 crc kubenswrapper[4735]: I0223 00:32:03.977121 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a150df6d-7047-4876-abab-189540f366fb-utilities" (OuterVolumeSpecName: "utilities") pod "a150df6d-7047-4876-abab-189540f366fb" (UID: "a150df6d-7047-4876-abab-189540f366fb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 00:32:03 crc kubenswrapper[4735]: I0223 00:32:03.985485 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a150df6d-7047-4876-abab-189540f366fb-kube-api-access-hqwhr" (OuterVolumeSpecName: "kube-api-access-hqwhr") pod "a150df6d-7047-4876-abab-189540f366fb" (UID: "a150df6d-7047-4876-abab-189540f366fb"). InnerVolumeSpecName "kube-api-access-hqwhr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 23 00:32:04 crc kubenswrapper[4735]: I0223 00:32:04.059438 4735 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a150df6d-7047-4876-abab-189540f366fb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a150df6d-7047-4876-abab-189540f366fb" (UID: "a150df6d-7047-4876-abab-189540f366fb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 23 00:32:04 crc kubenswrapper[4735]: I0223 00:32:04.077938 4735 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a150df6d-7047-4876-abab-189540f366fb-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 23 00:32:04 crc kubenswrapper[4735]: I0223 00:32:04.077987 4735 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a150df6d-7047-4876-abab-189540f366fb-utilities\") on node \"crc\" DevicePath \"\"" Feb 23 00:32:04 crc kubenswrapper[4735]: I0223 00:32:04.078007 4735 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqwhr\" (UniqueName: \"kubernetes.io/projected/a150df6d-7047-4876-abab-189540f366fb-kube-api-access-hqwhr\") on node \"crc\" DevicePath \"\"" Feb 23 00:32:04 crc kubenswrapper[4735]: I0223 00:32:04.455802 4735 generic.go:334] "Generic (PLEG): container finished" podID="a150df6d-7047-4876-abab-189540f366fb" containerID="82200e4bbe9b2e3fb4a935afdb53d439890fa7dc7cfd5feb35a4f674c5b2be6a" exitCode=0 Feb 23 00:32:04 crc kubenswrapper[4735]: I0223 00:32:04.455914 4735 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-72lkj" Feb 23 00:32:04 crc kubenswrapper[4735]: I0223 00:32:04.455906 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-72lkj" event={"ID":"a150df6d-7047-4876-abab-189540f366fb","Type":"ContainerDied","Data":"82200e4bbe9b2e3fb4a935afdb53d439890fa7dc7cfd5feb35a4f674c5b2be6a"} Feb 23 00:32:04 crc kubenswrapper[4735]: I0223 00:32:04.455988 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-72lkj" event={"ID":"a150df6d-7047-4876-abab-189540f366fb","Type":"ContainerDied","Data":"80e028d329c50500347009f9bae314c3b949424899eb79a26504cd4dc12c84f2"} Feb 23 00:32:04 crc kubenswrapper[4735]: I0223 00:32:04.456021 4735 scope.go:117] "RemoveContainer" containerID="82200e4bbe9b2e3fb4a935afdb53d439890fa7dc7cfd5feb35a4f674c5b2be6a" Feb 23 00:32:04 crc kubenswrapper[4735]: I0223 00:32:04.494075 4735 scope.go:117] "RemoveContainer" containerID="505c34aeac2b570b60333a1776ad0038b5f4618367111a3c14d2eadbe03e9470" Feb 23 00:32:04 crc kubenswrapper[4735]: I0223 00:32:04.495264 4735 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-72lkj"] Feb 23 00:32:04 crc kubenswrapper[4735]: I0223 00:32:04.506920 4735 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-72lkj"] Feb 23 00:32:04 crc kubenswrapper[4735]: I0223 00:32:04.521642 4735 scope.go:117] "RemoveContainer" containerID="3e43f16d01ca0cdac694be0eca04bbaf920ee26735e72affc08f4d3717cfd7bd" Feb 23 00:32:04 crc kubenswrapper[4735]: I0223 00:32:04.560747 4735 scope.go:117] "RemoveContainer" containerID="82200e4bbe9b2e3fb4a935afdb53d439890fa7dc7cfd5feb35a4f674c5b2be6a" Feb 23 00:32:04 crc kubenswrapper[4735]: E0223 00:32:04.561389 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"82200e4bbe9b2e3fb4a935afdb53d439890fa7dc7cfd5feb35a4f674c5b2be6a\": container with ID starting with 82200e4bbe9b2e3fb4a935afdb53d439890fa7dc7cfd5feb35a4f674c5b2be6a not found: ID does not exist" containerID="82200e4bbe9b2e3fb4a935afdb53d439890fa7dc7cfd5feb35a4f674c5b2be6a" Feb 23 00:32:04 crc kubenswrapper[4735]: I0223 00:32:04.561442 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82200e4bbe9b2e3fb4a935afdb53d439890fa7dc7cfd5feb35a4f674c5b2be6a"} err="failed to get container status \"82200e4bbe9b2e3fb4a935afdb53d439890fa7dc7cfd5feb35a4f674c5b2be6a\": rpc error: code = NotFound desc = could not find container \"82200e4bbe9b2e3fb4a935afdb53d439890fa7dc7cfd5feb35a4f674c5b2be6a\": container with ID starting with 82200e4bbe9b2e3fb4a935afdb53d439890fa7dc7cfd5feb35a4f674c5b2be6a not found: ID does not exist" Feb 23 00:32:04 crc kubenswrapper[4735]: I0223 00:32:04.561476 4735 scope.go:117] "RemoveContainer" containerID="505c34aeac2b570b60333a1776ad0038b5f4618367111a3c14d2eadbe03e9470" Feb 23 00:32:04 crc kubenswrapper[4735]: E0223 00:32:04.562012 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"505c34aeac2b570b60333a1776ad0038b5f4618367111a3c14d2eadbe03e9470\": container with ID starting with 505c34aeac2b570b60333a1776ad0038b5f4618367111a3c14d2eadbe03e9470 not found: ID does not exist" containerID="505c34aeac2b570b60333a1776ad0038b5f4618367111a3c14d2eadbe03e9470" Feb 23 00:32:04 crc kubenswrapper[4735]: I0223 00:32:04.562075 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"505c34aeac2b570b60333a1776ad0038b5f4618367111a3c14d2eadbe03e9470"} err="failed to get container status \"505c34aeac2b570b60333a1776ad0038b5f4618367111a3c14d2eadbe03e9470\": rpc error: code = NotFound desc = could not find container \"505c34aeac2b570b60333a1776ad0038b5f4618367111a3c14d2eadbe03e9470\": container with ID 
starting with 505c34aeac2b570b60333a1776ad0038b5f4618367111a3c14d2eadbe03e9470 not found: ID does not exist" Feb 23 00:32:04 crc kubenswrapper[4735]: I0223 00:32:04.562111 4735 scope.go:117] "RemoveContainer" containerID="3e43f16d01ca0cdac694be0eca04bbaf920ee26735e72affc08f4d3717cfd7bd" Feb 23 00:32:04 crc kubenswrapper[4735]: E0223 00:32:04.562497 4735 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e43f16d01ca0cdac694be0eca04bbaf920ee26735e72affc08f4d3717cfd7bd\": container with ID starting with 3e43f16d01ca0cdac694be0eca04bbaf920ee26735e72affc08f4d3717cfd7bd not found: ID does not exist" containerID="3e43f16d01ca0cdac694be0eca04bbaf920ee26735e72affc08f4d3717cfd7bd" Feb 23 00:32:04 crc kubenswrapper[4735]: I0223 00:32:04.562524 4735 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e43f16d01ca0cdac694be0eca04bbaf920ee26735e72affc08f4d3717cfd7bd"} err="failed to get container status \"3e43f16d01ca0cdac694be0eca04bbaf920ee26735e72affc08f4d3717cfd7bd\": rpc error: code = NotFound desc = could not find container \"3e43f16d01ca0cdac694be0eca04bbaf920ee26735e72affc08f4d3717cfd7bd\": container with ID starting with 3e43f16d01ca0cdac694be0eca04bbaf920ee26735e72affc08f4d3717cfd7bd not found: ID does not exist" Feb 23 00:32:06 crc kubenswrapper[4735]: I0223 00:32:06.287568 4735 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a150df6d-7047-4876-abab-189540f366fb" path="/var/lib/kubelet/pods/a150df6d-7047-4876-abab-189540f366fb/volumes" Feb 23 00:32:11 crc kubenswrapper[4735]: I0223 00:32:11.512878 4735 patch_prober.go:28] interesting pod/machine-config-daemon-blmnv container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 23 00:32:11 crc kubenswrapper[4735]: I0223 
00:32:11.513379 4735 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-blmnv" podUID="1cba474f-2d55-4a07-969f-25e2817a06d0" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 23 00:32:16 crc kubenswrapper[4735]: I0223 00:32:16.251418 4735 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7npzp"] Feb 23 00:32:16 crc kubenswrapper[4735]: E0223 00:32:16.252100 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a150df6d-7047-4876-abab-189540f366fb" containerName="extract-content" Feb 23 00:32:16 crc kubenswrapper[4735]: I0223 00:32:16.252116 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a150df6d-7047-4876-abab-189540f366fb" containerName="extract-content" Feb 23 00:32:16 crc kubenswrapper[4735]: E0223 00:32:16.252137 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a150df6d-7047-4876-abab-189540f366fb" containerName="extract-utilities" Feb 23 00:32:16 crc kubenswrapper[4735]: I0223 00:32:16.252145 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a150df6d-7047-4876-abab-189540f366fb" containerName="extract-utilities" Feb 23 00:32:16 crc kubenswrapper[4735]: E0223 00:32:16.252168 4735 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a150df6d-7047-4876-abab-189540f366fb" containerName="registry-server" Feb 23 00:32:16 crc kubenswrapper[4735]: I0223 00:32:16.252176 4735 state_mem.go:107] "Deleted CPUSet assignment" podUID="a150df6d-7047-4876-abab-189540f366fb" containerName="registry-server" Feb 23 00:32:16 crc kubenswrapper[4735]: I0223 00:32:16.252331 4735 memory_manager.go:354] "RemoveStaleState removing state" podUID="a150df6d-7047-4876-abab-189540f366fb" containerName="registry-server" Feb 23 00:32:16 crc kubenswrapper[4735]: I0223 00:32:16.253422 4735 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7npzp" Feb 23 00:32:16 crc kubenswrapper[4735]: I0223 00:32:16.269554 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7npzp"] Feb 23 00:32:16 crc kubenswrapper[4735]: I0223 00:32:16.417310 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjj9q\" (UniqueName: \"kubernetes.io/projected/c52a990b-88b4-428d-bbdb-7adb42e113c3-kube-api-access-zjj9q\") pod \"redhat-operators-7npzp\" (UID: \"c52a990b-88b4-428d-bbdb-7adb42e113c3\") " pod="openshift-marketplace/redhat-operators-7npzp" Feb 23 00:32:16 crc kubenswrapper[4735]: I0223 00:32:16.417529 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c52a990b-88b4-428d-bbdb-7adb42e113c3-utilities\") pod \"redhat-operators-7npzp\" (UID: \"c52a990b-88b4-428d-bbdb-7adb42e113c3\") " pod="openshift-marketplace/redhat-operators-7npzp" Feb 23 00:32:16 crc kubenswrapper[4735]: I0223 00:32:16.418547 4735 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c52a990b-88b4-428d-bbdb-7adb42e113c3-catalog-content\") pod \"redhat-operators-7npzp\" (UID: \"c52a990b-88b4-428d-bbdb-7adb42e113c3\") " pod="openshift-marketplace/redhat-operators-7npzp" Feb 23 00:32:16 crc kubenswrapper[4735]: I0223 00:32:16.520311 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjj9q\" (UniqueName: \"kubernetes.io/projected/c52a990b-88b4-428d-bbdb-7adb42e113c3-kube-api-access-zjj9q\") pod \"redhat-operators-7npzp\" (UID: \"c52a990b-88b4-428d-bbdb-7adb42e113c3\") " pod="openshift-marketplace/redhat-operators-7npzp" Feb 23 00:32:16 crc kubenswrapper[4735]: I0223 00:32:16.520390 4735 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c52a990b-88b4-428d-bbdb-7adb42e113c3-utilities\") pod \"redhat-operators-7npzp\" (UID: \"c52a990b-88b4-428d-bbdb-7adb42e113c3\") " pod="openshift-marketplace/redhat-operators-7npzp" Feb 23 00:32:16 crc kubenswrapper[4735]: I0223 00:32:16.520814 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c52a990b-88b4-428d-bbdb-7adb42e113c3-utilities\") pod \"redhat-operators-7npzp\" (UID: \"c52a990b-88b4-428d-bbdb-7adb42e113c3\") " pod="openshift-marketplace/redhat-operators-7npzp" Feb 23 00:32:16 crc kubenswrapper[4735]: I0223 00:32:16.520885 4735 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c52a990b-88b4-428d-bbdb-7adb42e113c3-catalog-content\") pod \"redhat-operators-7npzp\" (UID: \"c52a990b-88b4-428d-bbdb-7adb42e113c3\") " pod="openshift-marketplace/redhat-operators-7npzp" Feb 23 00:32:16 crc kubenswrapper[4735]: I0223 00:32:16.521126 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c52a990b-88b4-428d-bbdb-7adb42e113c3-catalog-content\") pod \"redhat-operators-7npzp\" (UID: \"c52a990b-88b4-428d-bbdb-7adb42e113c3\") " pod="openshift-marketplace/redhat-operators-7npzp" Feb 23 00:32:16 crc kubenswrapper[4735]: I0223 00:32:16.550978 4735 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjj9q\" (UniqueName: \"kubernetes.io/projected/c52a990b-88b4-428d-bbdb-7adb42e113c3-kube-api-access-zjj9q\") pod \"redhat-operators-7npzp\" (UID: \"c52a990b-88b4-428d-bbdb-7adb42e113c3\") " pod="openshift-marketplace/redhat-operators-7npzp" Feb 23 00:32:16 crc kubenswrapper[4735]: I0223 00:32:16.602736 4735 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7npzp" Feb 23 00:32:16 crc kubenswrapper[4735]: I0223 00:32:16.860876 4735 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7npzp"] Feb 23 00:32:17 crc kubenswrapper[4735]: I0223 00:32:17.602427 4735 generic.go:334] "Generic (PLEG): container finished" podID="c52a990b-88b4-428d-bbdb-7adb42e113c3" containerID="1bb49053883ed70ad5472c843c201da53f6fe783888456d794c0c7acd33d9298" exitCode=0 Feb 23 00:32:17 crc kubenswrapper[4735]: I0223 00:32:17.602467 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7npzp" event={"ID":"c52a990b-88b4-428d-bbdb-7adb42e113c3","Type":"ContainerDied","Data":"1bb49053883ed70ad5472c843c201da53f6fe783888456d794c0c7acd33d9298"} Feb 23 00:32:17 crc kubenswrapper[4735]: I0223 00:32:17.602496 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7npzp" event={"ID":"c52a990b-88b4-428d-bbdb-7adb42e113c3","Type":"ContainerStarted","Data":"672957601ef0d2451e35e2567598e172296e246646c63507e3febe20982e8b6a"} Feb 23 00:32:19 crc kubenswrapper[4735]: I0223 00:32:19.633791 4735 generic.go:334] "Generic (PLEG): container finished" podID="c52a990b-88b4-428d-bbdb-7adb42e113c3" containerID="d32917c4fa8a13ac58329513f97f5ced73310f1d8a6c7cb981e971666bcee003" exitCode=0 Feb 23 00:32:19 crc kubenswrapper[4735]: I0223 00:32:19.634596 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7npzp" event={"ID":"c52a990b-88b4-428d-bbdb-7adb42e113c3","Type":"ContainerDied","Data":"d32917c4fa8a13ac58329513f97f5ced73310f1d8a6c7cb981e971666bcee003"} Feb 23 00:32:20 crc kubenswrapper[4735]: I0223 00:32:20.646438 4735 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7npzp" 
event={"ID":"c52a990b-88b4-428d-bbdb-7adb42e113c3","Type":"ContainerStarted","Data":"7383f2a5985d29a24f9aca2e84135cb0a914aacc77c34c90c0c9d6ba655eea84"} Feb 23 00:32:20 crc kubenswrapper[4735]: I0223 00:32:20.677048 4735 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7npzp" podStartSLOduration=2.123528415 podStartE2EDuration="4.677003759s" podCreationTimestamp="2026-02-23 00:32:16 +0000 UTC" firstStartedPulling="2026-02-23 00:32:17.604455503 +0000 UTC m=+1496.068001464" lastFinishedPulling="2026-02-23 00:32:20.157930807 +0000 UTC m=+1498.621476808" observedRunningTime="2026-02-23 00:32:20.671905303 +0000 UTC m=+1499.135451364" watchObservedRunningTime="2026-02-23 00:32:20.677003759 +0000 UTC m=+1499.140549740" Feb 23 00:32:26 crc kubenswrapper[4735]: I0223 00:32:26.603044 4735 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7npzp" Feb 23 00:32:26 crc kubenswrapper[4735]: I0223 00:32:26.603541 4735 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7npzp" Feb 23 00:32:27 crc kubenswrapper[4735]: I0223 00:32:27.663963 4735 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7npzp" podUID="c52a990b-88b4-428d-bbdb-7adb42e113c3" containerName="registry-server" probeResult="failure" output=< Feb 23 00:32:27 crc kubenswrapper[4735]: timeout: failed to connect service ":50051" within 1s Feb 23 00:32:27 crc kubenswrapper[4735]: >